30 December 2014

Intranet stats - a neglected security metric

Most organizations of any size have a corporate intranet and I suspect you, dear reader, have an information or IT security website on yours.

Are you tracking the page views?

The count, or rather the trend in the number of page views for the security site, can be an interesting, useful, perhaps even PRAGMATIC metric in its own right.

Take this very blog for example. Google kindly tracks and conveniently provides the admins with page view statistics in the form of little blue graphs. Google's default stats view shows the daily page counts for the present month, something like this:

Given the specialist nature of security metrics and our relatively narrow (distinguished, enlightened and very welcome!) readership, the default graph is too peaky, whereas it is a little easier to identify trends from the monthly version:


Pulling further back, the aggregated annual stats follow a pretty clear pattern which we've picked out by eye in red just in case you missed it:

The book had not even been printed when we launched this blog back in 2012. Interest peaked when it was published in January 2013, then declined gently until a few months ago when, we are delighted to report, the upward trend resumed quite strongly - a second wave, maybe.

Of course in spotting the second wave we might be exhibiting 'confirmation bias', one of many biases noted in the book, and if it mattered we really ought to consult a qualified statistician to analyze the numbers scientifically rather than 'by eye' ... but this is merely an illustration, and 'by eye' is good enough for our purposes. We're plenty patient enough to wait the months it will take to determine whether the apparent upward trend turns out to be genuine or just wishful thinking!
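If you did want to go beyond eyeballing without hiring that statistician, even a crude least-squares fit is revealing. Here's a minimal sketch - the monthly counts are invented for illustration, not our real stats:

```python
# Minimal sketch: is an apparent upward trend in monthly page views
# statistically significant? Counts below are invented for illustration.
from scipy.stats import linregress

monthly_views = [410, 395, 430, 420, 455, 470,
                 460, 490, 510, 505, 540, 565]   # last 12 months, oldest first
fit = linregress(range(len(monthly_views)), monthly_views)

print(f"Slope: {fit.slope:+.1f} views/month, p = {fit.pvalue:.3f}")
if fit.slope > 0 and fit.pvalue < 0.05:
    print("The upward trend looks genuine, not just wishful thinking.")
else:
    print("Too soon to call - keep watching.")
```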

Turning now to your intranet security site, page counts like these are just one of a wide variety of statistics that the intranet webserver almost certainly tracks for you. Most webservers record far more information in their logs and, if you choose to go that route, dedicated tracking and analytics applications (such as Google Analytics, for publicly-accessible sites at least) offer a bewildering array of statistics concerning things such as the pages visited, time spent on each page, website assets downloaded, the sequence of pages visited, browser versions, visitor locations and more. True, some of those details can be withheld or faked by the more security-conscious or paranoid visitors, but that's even less likely on an intranet than on the WWW, so it can be safely ignored in this context.
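As an illustration of just how accessible the basic numbers are, here's a minimal sketch that tallies monthly page views for the security site from an Apache-style access log. The log filename and the /security/ URL prefix are assumptions purely for illustration:

```python
# Minimal sketch: monthly page-view counts for the intranet security site,
# tallied from an Apache/combined-format access log. The filename and the
# /security/ URL prefix are illustrative assumptions.
import re
from collections import Counter

MONTH_NUM = {m: f"{i:02d}" for i, m in enumerate(
    "Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec".split(), start=1)}
TIMESTAMP = re.compile(r"\[\d{2}/(\w{3})/(\d{4}):")   # e.g. [30/Dec/2014:...

views_per_month = Counter()
with open("access.log") as log:
    for line in log:
        if '"GET /security/' not in line:
            continue                       # count only the security site
        match = TIMESTAMP.search(line)
        if match:
            mon, year = match.groups()
            views_per_month[f"{year}-{MONTH_NUM[mon]}"] += 1

for month in sorted(views_per_month):      # chronological order
    print(f"{month}: {views_per_month[month]} page views")
```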

The big question, as always, is "Which metrics are actually worth the effort?" It costs real money to gather, analyze, present and consider metrics, so as with any other business activity, they need to earn their keep. Figuring out the answer, as always, involves first understanding the organization's goals or objectives for the intranet site, then elaborating on the questions arising, then identifying potential metrics, and finally down-selecting a few cost-effective metrics using an approach such as PRAGMATIC.

It can be quite interesting and useful to elaborate on the objectives for an intranet site, although we seldom bother; the questions arising are equally revealing. One might well ask, for example:
  • What constitutes a successful security intranet site?  How do we define success?  What are we trying to achieve?
  • What proportion of employees visit the site over the course of, say, a year? (sketched in code below)
  • Which parts of the site are the most or least popular ... and why?
  • Which pages are the most or least "sticky" or engaging ... and why? 
  • How does information security's intranet site stack up against those of other business departments?
  • Do visits to the site reflect awareness and training initiatives, or incidents, or something else (can we explain the patterns or trends)?
  • Are certain groups or categories of employee more or less likely than others to browse the site? 
  • ... 
... Once you start, it's not hard to come up with a list of objectives, a set of questions and (implicitly at least) a suite of possible metrics. If you find yourself short of inspiration, this is an ideal task for a metrics workshop where participants feed and feed off each other.
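Notice how directly some of those questions translate into measurements. Taking the second one as an example, here's a minimal sketch, assuming the intranet logs an authenticated username per visit and that the headcount is known - both illustrative assumptions:

```python
# Minimal sketch: what proportion of employees visited the security site
# at least once this year? Assumes visits are attributable to usernames
# and headcount is known - both illustrative assumptions.
def visit_coverage(visitor_names, headcount):
    """Fraction of staff who visited at least once."""
    return len(set(visitor_names)) / headcount

visits = ["asmith", "bjones", "asmith", "cnguyen", "bjones"]  # from the logs
print(f"{visit_coverage(visits, headcount=250):.1%} of employees visited")
```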

Analyzing the possible metrics to identify those you intend to use can also be done in the workshop setting, pragmatically on scratchpads, or more formally using spreadsheets and documents that get circulated for comment. 

Of course, if all of that is too hard, you can probably get what you need, for now, from the page counts ... so track them, and don't forget to check and ponder the stats every so often. It's as good a place to start as any.
Happy new year!
Gary

20 December 2014

Management awareness paper on email security metrics

Measuring the information security aspects of email and indeed other forms of person-to-person messaging implies first of all that you understand what your security arrangements are intended to achieve.  What does it mean to "secure email"?  If that's too hard to answer, turn it on its head: what might be the consequences of failing adequately to secure email? Does that help?

Our next metrics discussion paper opens with a brief analysis of the 'requirements and targets', also known as the objectives, of email security, expressed in broad terms. For instance, preventing or at least reducing the issues relating to or arising from spam and malware is a common objective ... hence one might want to measure spam and email-borne malware, among other aspects. 

That in turn begs questions about which specific parameters to measure and how - for instance, there are many possible ways to measure spam (a couple are sketched in code after this list), such as the:
  • Number of spam emails arriving at the organization, or rather the rate of arrival (spams per hour, day, month or whatever);
  • Number of spam emails leaving the organization (!);
  • Types of spams detected, perhaps analyzed according to the differing threats they represent, ranging perhaps from trivial/timewasting to targeted spear-phishing attacks;
  • Proportion of spam that is detected and blocked versus that passed to users (and, hopefully, identified by them as spam);
  • Cost of anti-spam controls, including the anti-spam software and systems, network and system load, user and support time spent on the problem etc.
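To make a couple of those concrete, the first and fourth bullets reduce to simple arithmetic once the gateway's counts are to hand. A minimal sketch, assuming the counts come from your anti-spam gateway's logs or reporting interface (the numbers are invented):

```python
# Minimal sketch: spam arrival rate plus the proportion detected/blocked.
# The input counts are invented; in practice they would come from the
# mail gateway's logs or reporting interface.
def spam_metrics(spam_arrived, spam_blocked, hours_observed):
    arrival_rate = spam_arrived / hours_observed   # spams per hour
    catch_rate = spam_blocked / spam_arrived       # proportion blocked
    reached_users = spam_arrived - spam_blocked
    return arrival_rate, catch_rate, reached_users

rate, caught, leaked = spam_metrics(spam_arrived=12_400,
                                    spam_blocked=12_030,
                                    hours_observed=24 * 7)   # one week
print(f"{rate:.0f} spams/hour; {caught:.1%} blocked; {leaked} reached users")
```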
For each of those potentially interesting concerns, there may be several actual ways to generate the corresponding measures, and several ways to analyze, present and hopefully use the data. The paper outlines just a few examples to illustrate the approach, but it should be obvious from the above that we could easily have written a very long and boring treatise just on email security metrics. We didn't ... because the paper was 'just' written for awareness purposes: it is designed to get managers thinking and talking about security metrics, opening their eyes to the possibilities and encouraging them to consider their own, specific information needs as opposed to the few generic examples given. Rather than boring the pants off our readers, we're hoping to intrigue and stimulate them - to catch their imagination, not write a book on it. [By the way, that's one of the objectives for the security awareness program, implying that perhaps we might measure it in order to confirm whether the awareness program is achieving the objective, and if not suggest how we might do better!]

In case you missed it, there's an important point here with implications for all metrics (not just infosec metrics!). What we choose to measure depends on our information needs, organizational circumstances, objectives, challenges, maturity level and so on. Your situation is different, hence you will probably be better served by a different set of metrics. Much as we would like to offer you a little pre-canned set of email security metrics, chances are they won't work terribly well for you. They'll be sub-optimal, perhaps costly, distracting and generally unhelpful. Remember this when anyone enthuses about particular security metrics, or when someone naively poses the classic "What metrics are best for X?"

One might argue that since we share a number of common information security challenges (such as spam), we might benefit from a number of common metrics ... but that's overly simplistic, just as it would be nuts to suggest that we should all employ identical anti-spam controls. Some email security metrics might well turn out to be more (or less) common than others, but that alone is not a sensible reason to select (or avoid) them.

This is why we invented the PRAGMATIC approach. While it is easy to come up with long lists of possible metrics using methods such as Goal-Question-Metric (see Lance Hayden's book "IT Security Metrics") and metrics catalogs (e.g. the 'consensus security metrics' from CIS), there was a distinct lack of guidance on how to shortlist and finally choose the few metrics actually worth implementing.
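For readers new to it, PRAGMATIC rates each candidate metric against nine criteria (Predictiveness, Relevance, Actionability, Genuineness, Meaningfulness, Accuracy, Timeliness, Independence and Cost) and averages the ratings into an overall score for ranking. The shortlisting arithmetic is deliberately simple - a minimal sketch with invented candidates and ratings:

```python
# Minimal sketch: PRAGMATIC shortlisting - rate each candidate metric
# 0-100 on the nine criteria, rank by the mean. Candidates and ratings
# below are invented for illustration.
from statistics import mean

candidates = {
    "Spam arrival rate":          [70, 85, 60, 80, 75, 90, 85, 70, 90],
    "Spam catch rate":            [65, 90, 75, 85, 80, 85, 85, 60, 85],
    "Cost of anti-spam controls": [50, 80, 70, 75, 70, 60, 40, 65, 55],
}

for metric, ratings in sorted(candidates.items(),
                              key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(ratings):5.1f}%  {metric}")
```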

14 December 2014

Management awareness paper on trade secret metrics


Protecting proprietary information, especially trade secrets, is - or rather should be - a priority for almost all organizations.

Trade secrets can be totally devalued if they are disclosed to or stolen by competitors, at least where that leads to their being exploited. The loss of competitive advantage can decimate an organization's profitability and, in the worst case, threaten its survival.

Availability and integrity are also of concern for proprietary information. If the information is destroyed or lost, the organization can no longer use it. If it is damaged or corrupted, perhaps even deliberately manipulated, the organization might continue to use it but is unlikely to find it as valuable.

Significant information security risks associated with proprietary information imply the need for strong, reliable information security controls, which in turn implies the need to monitor the risks and controls proactively. Being just three pages long, the awareness paper briefly introduces a few metrics, in five or six categories, that could be used to measure the organization's security arrangements for trade secrets: its purpose is not to specify individual metrics so much as to lift the covers on a few possibilities. Your organization needs to determine its own security measurement objectives, adopting metrics that suit your purposes. Hopefully, this brief security awareness paper will stimulate thought and discussion on metrics: where it leads from there is down to you.

05 December 2014

Management awareness paper on authentication metrics

User identification and authentication (I&A) is a key information security control for all systems, even those that allow public access (unless the general public are supposed to be able to reconfigure the system at will!). As such, it is important to be sure that I&A is working properly, especially on business- or safety-critical systems, which in turn implies a whole bunch of things. I&A must be:

  • Properly specified;
  • Professionally designed;
  • Thoroughly tested and proven;
  • Correctly implemented and configured;
  • Used!;
  • Professionally managed and maintained;
  • Routinely monitored.
Strangely, monitoring is often neglected for key controls. You'd think it was obvious that someone appropriate needs to keep a very close eye on the organization's key information security controls, since (by definition) the risk of key control failure is significant ... but no, many such controls are simply implemented and left to their own devices. Personally, I believe this is a serious blind spot in our profession.  

If unmonitored key controls fail, serious incidents occur unexpectedly. In contrast, management has the opportunity (hopefully!) to spot and respond to the warning signs for key controls that are being routinely monitored and reported on using suitable metrics.  Security metrics, then, are themselves key controls.


The management-level awareness briefing paper sets the scene by outlining common requirements for I&A, then briefly describes four types of metric that might be used to monitor, measure and generally keep an eye on various aspects of the control. Perhaps the most interesting is the authentication failure rate ... but to be honest my thinking on metrics has progressed in the 7+ years since this paper was written. The metrics in the paper look naive and basic to me now. Since I'm updating the authentication awareness module this month, I'll be thinking up better I&A metrics when I rewrite the paper for NoticeBored subscribers ... perhaps even scoring them using the PRAGMATIC method ... oh, and revising those awful graphics!
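That failure-rate metric is simple enough to sketch, at least in its most basic form - assuming you can extract success/failure outcomes from the authentication logs (the events below are invented):

```python
# Minimal sketch: authentication failure rate for a reporting period.
# The events list is an invented stand-in for real authentication logs.
events = ["success", "failure", "success", "success", "failure",
          "success", "failure", "failure", "success", "success"]

failures = events.count("failure")
rate = failures / len(events)
print(f"Authentication failure rate: {rate:.0%} ({failures}/{len(events)})")
# A rising trend could mean forgotten passwords (an awareness issue), a
# misconfiguration, or a brute-force attack in progress - worth a look.
```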