30 December 2014

Intranet stats - a neglected security metric

Most organizations of any size have a corporate intranet and I suspect you, dear reader, have an information or IT security website on yours.

Are you tracking the page views?

The count, or rather the trend in the number of page views for the security site can be an interesting, useful, perhaps even PRAGMATIC metric in its own right.

Take this very blog for example. Google kindly tracks and conveniently provides the admins with page view statistics in the form of little blue graphs. Google's default stats view shows the daily page counts for the present month, something like this:

Given the specialist nature of security metrics and our relatively narrow (distinguished, enlightened and very welcome!) readership, the default graph is too peaky, whereas it is a little easier to identify trends from the monthly version:

Pulling further back, the aggregated annual stats follow a pretty clear pattern which we've picked out by eye in red just in case you missed it:
The book had not even been printed when we launched this blog back in 2012. Interest peaked when it was published in January 2013, then declined gently until a few months ago when, we are delighted to report, the upward trend resumed quite strongly - a second wave, maybe.

Of course in spotting the second wave we might be exhibiting 'confirmation bias', one of many biases noted in the book, and if it mattered we really ought to consult a qualified statistician to analyze the numbers scientifically rather than 'by eye' ... but this is merely an illustration, and 'by eye' is good enough for our purposes. We're plenty patient enough to wait the months it will take to determine whether the apparent upward trend turns out to be genuine or just wishful thinking!

Turning now to your intranet security site, page counts like these are just one of a wide variety of statistics that the intranet webserver almost certainly tracks for you. Most webservers record far more information in their logs, and if you choose to go that route, dedicated tracking and analytical applications (such as Google Analytics, for publicly-accessible sites at least) offer a bewildering array of statistics: the pages visited, time spent on each page, website assets downloaded, the sequence of pages visited, browser versions, visitor locations and more. True, some of those details can be withheld or faked by the more security-conscious or paranoid visitors, but that's even less likely on an intranet than on the WWW, so can safely be ignored in this context.
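If you'd rather crunch the raw logs yourself, counting daily page views for the security section takes only a few lines. Here's a minimal Python sketch, assuming NCSA 'combined'-style access-log lines and a hypothetical /security path prefix - adjust both to suit your own webserver and site layout:

```python
from collections import Counter
from datetime import datetime

def daily_page_views(log_lines, path_prefix="/security"):
    """Count page views per day for one section of the intranet,
    from Apache/NCSA 'combined'-format access-log lines."""
    views = Counter()
    for line in log_lines:
        try:
            # e.g. 10.0.0.5 - jsmith [30/Dec/2014:09:15:02 +1300]
            #      "GET /security/policy.html HTTP/1.1" 200 5120
            request = line.split('"')[1]          # the quoted request
            path = request.split()[1]             # method, path, protocol
            stamp = line.split("[")[1].split("]")[0]
            day = datetime.strptime(stamp.split()[0],
                                    "%d/%b/%Y:%H:%M:%S").date()
        except (IndexError, ValueError):
            continue                              # skip malformed lines
        if path.startswith(path_prefix):
            views[day] += 1
    return views
```

Feed the resulting counts into whatever trend graph you like; the point is that the raw material is already sitting in the logs.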

The big question, as always, is "Which metrics are actually worth the effort?" It costs real money to gather, analyze, present and consider metrics, so as with any other business activity, they need to earn their keep. Figuring out the answer, as always, involves first understanding the organization's goals or objectives for the intranet site, then elaborating on the questions arising, then identifying potential metrics, and finally down-selecting a few cost-effective metrics using an approach such as PRAGMATIC.

It can be quite interesting and useful to elaborate on the objectives for an intranet site (although we seldom bother), and the questions arising are equally revealing.  One might well ask, for example:
  • What constitutes a successful security intranet site?  How do we define success?  What are we trying to achieve?
  • What proportion of employees visit the site over the course of, say, a year?
  • Which parts of the site are the most or least popular ... and why?
  • Which pages are the most or least "sticky" or engaging ... and why? 
  • How does information security's intranet site stack up against other business departments?
  • Do visits to the site reflect awareness and training initiatives, or incidents, or something else (can we explain the patterns or trends)?
  • Are certain groups or categories of employee more or less likely than others to browse the site? 
  • ... 
... Once you start, it's not hard to come up with a list of objectives, a set of questions and (implicitly at least) a suite of possible metrics. If you find yourself short of inspiration, this is an ideal task for a metrics workshop where participants feed and feed off each other.

Analyzing the possible metrics to identify those you intend to use can also be done in the workshop setting, pragmatically on scratchpads, or more formally using spreadsheets and documents that get circulated for comment. 

Of course, if all of that is too hard, you can probably get what you need, for now, from the page counts ... so track them, and don't forget to check and ponder the stats every so often. It's as good a place to start as any.
Happy new year!

20 December 2014

Management awareness paper on email security metrics

Measuring the information security aspects of email and indeed other forms of person-to-person messaging implies first of all that you understand what your security arrangements are intended to achieve.  What does it mean to "secure email"?  If that's too hard to answer, turn it on its head: what might be the consequences of failing adequately to secure email? Does that help?

Our next metrics discussion paper opens with a brief analysis of the 'requirements and targets', also known as the objectives, of email security, expressed in broad terms. For instance, preventing or at least reducing the issues relating to or arising from spam and malware is a common objective ... hence one might want to measure spam and email-borne malware, among other aspects. 

That in turn raises questions about which specific parameters to measure and how - for instance, there are many possible ways to measure spam, such as the:
  • Number of spam emails arriving at the organization, or rather the rate of arrival (spams per hour, day, month or whatever);
  • Number of spam emails leaving the organization (!);
  • Types of spams detected, perhaps analyzed according to the differing threats they represent, ranging perhaps from trivial/timewasting to targeted spear-phishing attacks;
  • Proportion of spam that is detected and blocked versus that passed to users (and, hopefully, identified by them as spam);
  • Cost of anti-spam controls, including the anti-spam software and systems, network and system load, user and support time spent on the problem etc.
For each of those potentially interesting concerns, there may be several actual ways to generate the corresponding measures, and several ways to analyze, present and hopefully use the data. The paper outlines just a few examples to illustrate the approach, but it should be obvious from the above that we could easily have written a very long and boring treatise just on email security metrics. We didn't ... because the paper was 'just' written for awareness purposes: it is designed to get managers thinking and talking about security metrics, opening their eyes to the possibilities and encouraging them to consider their own, specific information needs as opposed to the few generic examples given. Rather than boring the pants off our readers, we're hoping to intrigue and stimulate them, to catch their imagination, not write a book.  [By the way, that's one of the objectives for the security awareness program, implying that perhaps we might measure it in order to confirm whether the awareness program is achieving the objective, and if not suggest how we might do better!]
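To illustrate just one of the bullets above, the proportion of spam detected and blocked versus that reaching users boils down to a simple ratio of two counters. A hedged Python sketch, assuming hypothetical counts taken from the gateway's quarantine log and a user 'report spam' facility (your actual data sources will differ):

```python
def spam_catch_rate(blocked_at_gateway, reported_by_users):
    """Proportion of known spam stopped before reaching users.
    blocked_at_gateway: spams quarantined or rejected by the gateway.
    reported_by_users: spams that slipped through and were flagged
    by recipients (an undercount - unnoticed spam goes unrecorded)."""
    total = blocked_at_gateway + reported_by_users
    if total == 0:
        return None       # no spam observed; the rate is undefined
    return blocked_at_gateway / total
```

Note the built-in measurement bias: spam that reaches users but is never reported simply doesn't appear in the denominator, so the metric flatters the gateway - a caveat worth stating whenever the number is presented.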

In case you missed it, there's an important point here with implications for all metrics (not just infosec metrics!). What we choose to measure depends on our information needs, organizational circumstances, objectives, challenges, maturity level and so on.  Your situation is different, hence you will probably be better served by a different set of metrics. Much as we would like to offer you a little pre-canned set of email security metrics, chances are they won't work terribly well for you. They'll be sub-optimal, perhaps costly, distracting and generally unhelpful. Remember this when anyone enthuses about particular security metrics, or when someone naively poses the classic "What metrics are best for X?"

One might argue that since we share a number of common information security challenges (such as spam), we might benefit from a number of common metrics ... but that's overly simplistic, just as it would be nuts to suggest that we should all employ identical anti-spam controls. Some email security metrics might well turn out to be more or less common than others but that alone is not a sensible reason to select or avoid them respectively.

This is why we invented the PRAGMATIC approach. While it is easy to come up with long lists of possible metrics using methods such as Goal-Question-Metric (see Lance Hayden's book "IT Security Metrics") and metrics catalogs (e.g. the 'consensus security metrics' from CIS), there was a distinct lack of guidance on how to shortlist and finally choose the few metrics actually worth implementing.

14 December 2014

Management awareness paper on trade secret metrics

Protecting proprietary information, especially trade secrets, is - or rather should be - a priority for almost all organizations.

Trade secrets can be totally devalued if they are disclosed to or stolen by competitors who then exploit them. The loss of competitive advantage can decimate an organization's profitability and, in the worst case, threaten its survival.

Availability and integrity are also of concern for proprietary information. If the information is destroyed or lost, the organization can no longer use it. If it is damaged or corrupted, perhaps even deliberately manipulated, the organization might continue to use it but is unlikely to find it as valuable.

Significant information security risks associated with proprietary information imply the need for strong, reliable information security controls, which in turn implies the need to monitor the risks and controls proactively. Being just 3 pages long, the awareness paper barely introduces a few metrics that could be used to measure the organization's security arrangements for trade secrets in 5 or 6 categories: its purpose is not to specify individual metrics so much as to lift the covers on a few possibilities. Your organization needs to determine its own security measurement objectives, adopting metrics that suit your purposes. Hopefully, this brief security awareness paper will stimulate thought and discussion on metrics: where it leads from there is down to you.

05 December 2014

Management awareness paper on authentication metrics

User identification and authentication (I&A) is a key information security control for all systems, even those that allow public access (unless the general public are supposed to be able to reconfigure the system at will!). As such, it is important to be sure that I&A is working properly, especially on business- or safety-critical systems, which in turn implies a whole bunch of things. I&A must be:

  • Properly specified;
  • Professionally designed;
  • Thoroughly tested and proven;
  • Correctly implemented and configured;
  • Used!;
  • Professionally managed and maintained;
  • Routinely monitored.
Strangely, monitoring is often neglected for key controls. You'd think it was obvious that someone appropriate needs to keep a very close eye on the organization's key information security controls, since (by definition) the risk of key control failure is significant ... but no, many such controls are simply implemented and left to their own devices. Personally, I believe this is a serious blind spot in our profession.  

If unmonitored key controls fail, serious incidents occur unexpectedly. In contrast, management has the opportunity (hopefully!) to spot and respond to the warning signs for key controls that are being routinely monitored and reported on using suitable metrics.  Security metrics, then, are themselves key controls.

The management-level awareness briefing paper briefly sets the scene by outlining common requirements for I&A. It then briefly describes four types of metric that might be used to monitor, measure and generally keep an eye on various aspects of the control. Perhaps the most interesting is the authentication failure rate ... but to be honest my thinking on metrics has progressed in the 7+ years since this paper was written. The metrics in the paper look naive and basic to me now. Since I'm updating the authentication awareness module this month, I'll be thinking up better I&A metrics when I rewrite the paper for NoticeBored subscribers ... perhaps even scoring them using the PRAGMATIC method ... oh and revising those awful graphics!

17 November 2014

Management awareness paper on insider threat metrics

How do you measure 'insider threats' in your organization?  

If your answer is "We don't!", then I have to wonder how you are managing insider threats. 

Without suitable metrics, how do you figure out how much of a problem you might have from employees, contractors, consultants, temps and interns?  How do you determine where best to spend your security budget? How do you persuade management to loosen the purse strings sufficiently to address the risks?  I guess you guess!

The NoticeBored discussion paper breaks down 'insider threat' into chunks that can be measured sensibly.  The main divide falls between deliberate attacks (such as frauds by insiders) and accidents (such as mistakenly overwriting the entire production database - don't laugh, it happened to me 25 years ago and the nightmare still haunts me today!). 

The paper picks up on one of the most productive sources of information security metrics: the IT Help/Service Desk's problem and incident management process.  Most mid-to-large organizations these days route employee queries and problem reports through a centralized helpdesk function that acts as a clearing house, offering first-line support and routing unresolved calls to the appropriate resolving agencies for specialist help.  A valuable core part of their role is to ask questions and record information about calls in the ticketing system, tracking the calls through to completion.  The system, then, is a goldmine of information about queries, problems, events, incidents and other issues, including "near misses" (assuming the organization is sufficiently clued-up to encourage workers to report close-shaves as well as actual incidents).
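As a toy illustration of mining that goldmine, a few lines of Python suffice to split insider-related tickets into the two main chunks the paper describes. The field names here are entirely hypothetical - every ticketing system has its own schema, so substitute whatever categories yours actually records:

```python
from collections import Counter

def insider_incident_breakdown(tickets):
    """Tally helpdesk tickets tagged as insider-related, split into
    deliberate acts versus accidents.  Each ticket is a dict; the
    'insider' and 'deliberate' flags are hypothetical field names."""
    tally = Counter()
    for ticket in tickets:
        if ticket.get("insider"):
            kind = "deliberate" if ticket.get("deliberate") else "accidental"
            tally[kind] += 1
    return tally
```

Trend those two counts month by month and you have the beginnings of an insider-threat metric; the hard part, as the paper notes, is getting the helpdesk to categorize tickets consistently in the first place.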

If this is all Greek to you, find a spare hour or two to speak to a helpdesk supervisor about the call-ticketing system, about how calls are categorized and recorded, and how to squeeze suitable reports out of the system ... but, before you get completely carried away on a wave of enthusiasm, planning how to use all that sexy statistical and textual information, think very carefully about what you are doing.  The availability of information is not, in fact, the main concern when designing security metrics.  A far more important issue is to figure out what you want to know, what questions need to be answered.  Conceiving questions that fit the available answers is putting the cart before the horse.

12 November 2014

Management awareness paper on network security metrics

Measuring network security involves, first and foremost, determining what 'network security' encompasses, and how it relates to the business.

Writing way back in 2007, we said that network security "comprises a range of technical and procedural controls designed to prevent, detect and/or recover from security incidents affecting the corporate data networks – incidents such as unauthorized access (hacking), worms and other malware infections, and unplanned network downtime".  The context for the paper was a NoticeBored information security awareness module exploring security arrangements protecting data networks against both deliberate and accidental threats.

The paper described ways to measure network security incidents, controls, risks, compliance and governance.  It ended with an upbeat conclusion and call-to-action: "Do not neglect the value of having the experts present and discuss reports with management.  The dialogue that ensues adds value to the written reports.  Why not present and discuss these ideas with your management and seek their opinions, bringing to the table some prototype reports in one or more formats to stimulate discussion and clarify their objectives?"

The PRAGMATIC approach is a structured, systematic way to consider the pros and cons of various metrics, leading ultimately to a decision on which ones (if any!) to adopt.  Just as important as the final destination, the method leads managers and infosec pros together on a journey of discovery.

My guess is that the network security incident and compliance metrics would probably score above the others proposed in the paper but I can't tell for sure because I don't know a thing about your situation or measurement needs: your evaluation may well come to a markedly different conclusion.  Furthermore, in discussing the metrics paper, you might come up with 'variations on a theme', meaning variants of the metrics proposed that would score more highly, and perhaps something completely different, especially if it turns out that none of the metrics in the paper are suitable.  

That spark of creativity is the real power of PRAGMATIC.  Scientifically analyzing the factors determining the strength and suitability of the metrics under discussion leads to a better understanding of the metrics and of the measurement needs.  Contrast that with the blank-sheet approach.  Faced with the question "What network security metrics shall we adopt?", most business managers would be at a loss to know how to proceed.  If the security, network or IT people suggest one or more technical security metrics to fill the void (perhaps plucked from the air or picked out of some banal list with a pin), the managers don't have the basis for evaluating them, meaning that they are quite likely to accept whatever is offered, perhaps later coming to realize that they aren't exactly ideal.  If the metrics truly don't work, it's back to the drawing board to pick some more ... assuming someone has the good sense to call a halt to the senseless waste of time and effort (that's not a facile comment: many organizations drift aimlessly along for years with inappropriate, unsuitable or low-value metrics because everyone assumes someone else must be using them!*).

Although it is possible for the organization to develop a decent set of metrics by sheer trial-and-error, there is a distinct chance that good metrics will be discarded or discounted for arbitrary reasons, and that opportunities to 'tweak' metrics the better to fit the business needs will be missed.  Using the PRAGMATIC method as a systematic process to select, develop and maintain suitable metrics has to be a better way, don't you think?

Kind regards,

* That reminds me: have you audited your organization's security metrics? It may be a tedious assignment but a competent and diligent auditor should be able to follow the paper trail from gathering the source data through analysis and reporting to the decisions arising - if any.  If the trail goes cold, that's a strong indication that the metrics are simply not working, implying not just a waste of effort and money but an information void and lost opportunity.  Low quality metrics are a costly distraction, literally worse than useless!

05 November 2014

Management awareness paper on malware metrics

Malware - malicious software - encompasses a variety of computer viruses, Trojans, network worms, bots and other nasties.  

Malware has been the scourge of IT users ever since the Morris worm infected the early Internet way back in 1988.  Despite the enormous global investment over the intervening years in information security controls against malware (including security awareness!), it remains a significant security concern today. 

Although antivirus software companies sometimes admit that they are fighting a losing battle, malware is generating so much income both for the VXers (malware authors) and their criminal masterminds, plus the antivirus software companies, that the arms race looks set to continue for the foreseeable future.  Both sides are constantly investing in new tricks and techniques, fuelling a thriving black market in zero-day exploits and novel malware.

Meanwhile, the rest of us are lumbered with paying for it in one way or another. Addressing the malware risk is more involved than simply throwing money at antivirus software: malware metrics are the key to understanding the balance of risk against control, investing wisely, and doing our level best to keep up with, if not stay one step ahead of, the game in this dynamically evolving situation.

30 October 2014

Management awareness paper on database security metrics

The next NoticeBored security awareness paper suggests to management a whole bunch of metrics that might be used to measure the security of the organization's database systems.

Most information-packed application systems are built around databases, making database security a significant concern for the corporation.  We're talking about the crown jewels, the bet-the-farm databases containing customer, product and process information, emails, contracts, trade secrets, personal data and so much more.  Despite the importance of database security, we don't know of any organization systematically measuring it ... although we do know of many that struggle to keep on top of database security design, development, testing, patching, administration and maintenance!

So how exactly are management supposed to manage database security without database security measures? Extra sensory perception, perhaps, or gut-feel? Either way, it's hardly what one might call scientific management!

Download the paper here.  We'd be fascinated to hear your thoughts.  Do any of these measures catch your imagination?  What other database security metrics or measurement approaches would you suggest?  What do you use?

22 October 2014

Management awareness paper on IPR metrics

When we get a spare moment over forthcoming months, we plan to release a series of awareness papers describing metrics for a wide variety of information security topics through the SecurityMetametrics website.

The first paper, dating back to 2007, proposes a suite of information security management metrics relating specifically to the measurement of Intellectual Property Rights (IPR). IPR-related controls comprise the activities needed to reduce the chances of being prosecuted by third parties for failing to comply with their copyright, patents, trademarks etc., plus those necessary to protect the organization's own IPR from abuse by others. Managing and ideally optimizing those controls requires management to monitor and measure them, and so get a sense of the gap between present and required levels of control, apply corrective actions where necessary, and improve performance going forward.

These metrics papers were originally delivered to subscribers of the NoticeBored security awareness service, as part of the management stream.  Their primary purpose is to raise awareness of the monthly topic, but really we hope to encourage information security professionals and management to think about, discuss and perhaps adopt better security metrics.

If you follow the sequence, you'll notice our own thinking change over the 7 years since this first paper, particularly while PRAGMATIC Security Metrics was being written.  From time to time, we introduced new styles of metric, often covering the same information security topics repeatedly but from slightly different angles (there are currently 50 infosec topics in the NoticeBored portfolio, with still more to come).

If you'd like to discuss any of these papers, please comment here on the blog or through Google Plus.

09 August 2014

Hot crazy matrix

The universal hot crazy matrix is an amusing demonstration of the power of presenting numeric data in graphical form, extracting meaningful information from the data in order to lift the discussion off the page. We shall have to include it in our security metrics course.

Non-PC sexist humour aside, the presenter's knowledge and passion for the subject are undeniable.  Contrast that enthusiastic, lively presentation with the dull, ponderous, matter-of-fact way we normally present information security and other business metrics. 'Nuff said. For more, come on the course!

10 July 2014

Pragmatic use for a PRAGMATIC security metric

I have just published a tool developed by Ed Hodgson, Marty Carter and me to help people estimate how long their ISO/IEC 27001 ISMS implementation projects will take.

The tool is an Excel spreadsheet (DOWNLOAD). As with the remainder of the ISO27k Toolkit, it is free to use and is covered by a Creative Commons license. I will roll it into the Toolkit when the Toolkit is next updated.

The estimated project timescale depends on how you score your organization against a set of criteria - things such as the extent to which management supports the ISMS project, and its strategic fit. The scoring process uses a percentage scale with textual descriptions at four points on the scale, similar to those Krag and I described in PRAGMATIC Security Metrics. The criteria are weighted, since some are way more important than others. The scores you enter either increase or decrease the estimated timescale from a default value, using a model coded into the spreadsheet.

Ed enhanced my original model with a more sophisticated method of calculation: Ed’s version substantially extends the timescale if you score low against any criteria, emphasizing the adverse impact of issues such as limited management support and strategic fit. I have left both versions of the model in the file so you can try them both and compare them to see which works best for you … and of course you can play with the models, the criteria and the weightings as well as the scores. I suspect that Ed’s version is more accurate than mine, but maybe both are way off-base. Perhaps we have neglected some factor that you found critical?  Perhaps the weightings or the default timescale are wrong?  If you have successfully completed ISMS implementation projects, please take a look at the criteria and the models, and maybe push your numbers through to see how accurate the estimations would have been.
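For the curious, the gist of a weighted-criteria timescale model can be sketched in a few lines. To be clear, this is an illustrative reconstruction in Python, not the actual formulas in either version of the spreadsheet; the 50% midpoint, the plus-or-minus-50% swing per criterion, and the 12-month default are all my assumptions for the sake of the example:

```python
def estimate_timescale(scores, weights, default_months=12.0):
    """Adjust a default project timescale by weighted criterion scores.
    scores: dict of criterion -> 0-100 (%).  A score of 50 leaves the
    default unchanged; higher scores shorten the estimate, lower
    scores extend it, in proportion to each criterion's weight."""
    assert scores.keys() == weights.keys()
    total_weight = sum(weights.values())
    factor = 1.0
    for criterion, score in scores.items():
        w = weights[criterion] / total_weight
        # each criterion scales the estimate by up to +/-50%,
        # proportional to its share of the total weight
        factor *= 1.0 + w * (50 - score) / 100.0
    return default_months * factor
```

Ed's enhancement, stretching the timescale sharply when any single criterion scores low, could be modelled by making the per-criterion factor non-linear below the midpoint; the spreadsheet is the place to see how he actually did it.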

Feedback comments are very welcome – improvement suggestions especially – preferably on the ISO27k Forum for the benefit of the whole community, otherwise directly to me if you’re shy.

I’m afraid we haven’t yet managed to figure out how to estimate the resourcing (man-days) needed for the implementation project, as we originally planned. A couple of approaches have been suggested (such as breaking down the requirements in ISO/IEC 27001 to identify the activities and competences/skills needed) but it will take more effort to turn the suggestions into a practical tool. If you are inspired to have a go at developing a suitable tool, please make a start and I can set up another collaborative project on Google Docs to continue the development. Further general suggestions are fine but we really need something more concrete to sink our teeth into – a draft or skeleton resourcing estimator would be good. How would you go about it?

Gary Hinson  (Gary@isect.com)  

07 July 2014

ASIS report on security metrics

"Persuading Senior Management with Effective, Evaluated Security Metrics" is a lengthy new research report from ASIS Foundation, a membership body for (primarily) physical security professionals.

Quoting from the report's executive summary:
"Security metrics support the value proposition of an organization’s security operation. Without compelling metrics, security professionals and their budgets continue largely on the intuition of company leadership. With metrics, the security function grounds itself on measurable results that correlate with investment, and the security professional can speak to leadership in a familiar business language."
Fair enough.  That's similar to what we wrote in PRAGMATIC Security Metrics, in referring to measurable results and business orientation.
"Security metrics are vital, but in the field and in the literature one finds few tested metrics and little guidance on using metrics effectively to inform and persuade senior management." 
I'm not sure I agree that there are 'few tested security metrics' - it all depends on what we mean by 'tested' and 'security metrics'. I know there are hundreds of information security metrics in use, scores of which I believe are relatively widespread. I also know how easy it is to specify literally thousands of potential information security metrics covering various aspects and facets of our field, especially if one considers variants of any given metric as distinct metrics (e.g. 'the malware threat' could be measured in dozens of different ways, and each of those measures could be expressed or reported in dozens of ways, implying several gross of malware threat metrics).

We described and evaluated about 150 information security metrics examples in PRAGMATIC Security Metrics, and mentioned or hinted at numerous variants that might address some of the shortcomings we found in the examples.
"To address the gap, in spring 2013 the ASIS Foundation sponsored a major research project designed to add to the body of knowledge about security metrics and to empower security professionals to better assess and present metrics. The Foundation awarded a grant to Global Skills X-change (GSX), partnered with Ohlhausen Research, to carry out the project." 
GSX, tagline "Define. Measure. Optimize.", describes itself as "... a professional services firm that specializes in designing workforce education strategies and processes, which allow customers to meet their specific performance goals. The GSX core business model revolves around defining functional competency models and developing valid and reliable assessment tools as the foundation of credentialing and educational programs."

As to Ohlhausen Research, "A researcher in the security field for more than 25 years, [Peter Ohlhausen, President of Ohlhausen Research Inc.] has assisted in the multi-year revision of Protection of Assets, served as senior editor of Security Management magazine, and conducted numerous research and consulting projects for the U.S. Department of Justice, U.S. Department of Homeland Security, ASIS, and corporate clients."
"This report provides the project’s findings, including its three practical, actionable products:
  • The Security Metrics Evaluation Tool (Security MET), which security professionals can self-administer to develop, evaluate, and improve security metrics 
  • A library of metric descriptions, each evaluated according to the Security MET criteria 
  • Guidelines for effective use of security metrics to inform and persuade senior management, with an emphasis on organizational risk and return on investment"
Security MET turns out to be a method for assessing and scoring metrics according to 9 criteria in 3 categories, described in some detail in Appendix A of the ASIS report:
Technical Criteria – Category 1
1. Reliability
2. Validity
3. Generalizability
Operational (Security) Criteria – Category 2
4. Cost 
5. Timeliness
6. Manipulation
Strategic (Corporate) Criteria – Category 3
7. Return on Investment
8. Organizational Relevance
9. Communication
I see interesting parallels to the 9 PRAGMATIC criteria we have described. Security MET requires users to score metrics against each criterion on a 1-to-5 scale with three of the scoring points described, rather than the percentage scale with four described scoring points that we prefer, but it is otherwise much the same process. It appears that we have converged on a generalized method for assessing or evaluating metrics. However, PRAGMATIC Security Metrics, and indeed other metrics books, are notably absent from the report's "comprehensive review of the current state of metric development and application".  Information security is barely even mentioned in the entire report, an unfortunate omission given the convergence of our fields.
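Totting up a Security MET assessment is trivial arithmetic: nine criteria, each rated 1 to 5, for a maximum of 45. A quick Python sketch (the criterion names come from Appendix A of the ASIS report; the function itself is merely our own illustration):

```python
MET_CRITERIA = {
    "Technical":   ["Reliability", "Validity", "Generalizability"],
    "Operational": ["Cost", "Timeliness", "Manipulation"],
    "Strategic":   ["Return on Investment", "Organizational Relevance",
                    "Communication"],
}

def met_score(ratings):
    """Total a Security MET assessment: nine criteria, each rated 1-5,
    giving a maximum possible score of 45."""
    names = [c for group in MET_CRITERIA.values() for c in group]
    missing = set(names) - set(ratings)
    if missing:
        raise ValueError("unrated criteria: %s" % sorted(missing))
    for name in names:
        if not 1 <= ratings[name] <= 5:
            raise ValueError("%s: rating must be 1-5" % name)
    return sum(ratings[name] for name in names)
```

As the report's authors themselves caution, the total is best read as distance from the ideal 45 for one metric in one organization, not as a basis for comparing metrics across organizations.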

The ASIS report goes on to list just 16 [physical] security metrics, described as "authentic examples" that had been identified through the researchers' telephone interviews with respondents:
1. Office Space Usage Metric
2. Security Activity Metric
3. Environmental Risk Metric
4. Averted External Loss Metric
5. Security Audit Metric
6. Officer Performance Metric Panel
7. Security-Safety Metric
8. Security Incidents Metric
9. Personnel Security Clearance Processing Metric
10. Loss Reduction/Security Cost Metric
11. Operations Downtime Reduction Metric
12. Due Diligence Metric
13. Shortage/Shrinkage Metric
14. Phone Theft Metric
15. Security Inspection Findings Metric
16. Infringing Website Compliance Metric
These metrics have each been MET-scored by a handful of people, potentially generating statistics on the consistency or reliability of the scoring method, although those statistics are not actually provided in the report. Nor is there any information on the method's repeatability - whether the same assessors would produce the same scores on a second pass. The authors do, however, note that "The total score may suggest how close the metric is to attaining the highest possible score (45), but it is not likely to be useful for comparing different metrics, as the scoring would be different for users in different organizations.", suggesting that they are uneasy about the method's consistency between different assessors.  

Overall, this report is a worthwhile contribution to the security metrics literature.  Take a look!

18 June 2014

Another day, another survey, another ten failures

An article in an eZine concerning a security survey by PwC, sponsored by Iron Mountain, caught my eye today because they offer to benchmark respondents against others. So, purely in the interest of metrics research, I had a go at the benchmark tool.

First of all, the tool asked me for an email address without explaining why. Fail #1 (see also #6 below).

Thankfully, the email address validation routine is easily fooled. Fail #2 (or possibly Success #1 depending on one's perspective!).

Next the survey asked about 20 questions, mostly lame and some badly worded. There is no explanation about why those 20 questions have been selected. They address only a small part of information security. Fail #3.

All 20 questions have the same set of 4 possible multiple-choice answers, even though the stock answers don't cover all possibilities and don't even make sense for all the questions. The survey design is poor. Fail #4.

At the end of the survey, I was presented with a comparative "index score", in my case 41 (presumably 41%), along with a nasty day-glo bar chart and the following commentary:
"Your risk level is serious. Your score is well below the PwC recommended threshold, and it is only a matter of time before serious problems occur. Your business needs to take action now to improve information security and working practices. Read this report for further insight into your business risk and gain practical advice on how you can increase your index score and reduce your risk."
Our risk level is 'serious', eh?  Spot the scare tactics! FUD! Thanks to those 20 lame questions, they know next to nothing about our true situation, yet they presume to tell me we are "well below the PwC recommended threshold". Bloody cheek! Yes, we are facing various information security threats and have various information security vulnerabilities, and no, we have not implemented all the information security controls that might be appropriate for other organizations, but that's not the same as saying we are at serious risk. As my pal Anton would say, "Context is everything". Fail #5.

Next I was asked for yet more personal information in order to access the 'personalized report'. In reality, of course, this is clearly a marketing initiative so I know what they are up to, though again they don't actually say. Failing to explain why the information is needed conflicts with at least one of the OECD privacy principles concerning personal data collection and I think would be illegal under the privacy laws in most of the world outside the US. Fail #6.  

And again the data entry validation routines are weak. Fail #7.

The 'personalized report', "Your information risk profile", compares my score against "averages" (mean scores?) from the PwC/Iron Mountain survey in a PDF report. Generating the PDF on the fly is cool ... but the actual content is poor. The scope and purpose of the PwC survey are not stated in the 'personalized report', nor is the sample size or other basic information about the survey methods. The entire basis of the benchmarking is dubious, particularly if the PwC survey that generated the comparative data also used the lame 20 question multi-choice method. It's essentially meaningless drivel. Fail #8.

For no obvious reason, the 'personalized report' includes a page stating 4 "worrying facts", the first of which being "88% consider paper to be the biggest threat to information security". Eh? In all my years of information security risk management, I have NEVER heard paper being described as an information security threat. Paper is an asset usually of negligible value, although the information content on paperwork can be extremely valuable and a few, very rare bits of paper are priceless ... oh, hang on, Iron Mountain sponsored this survey, right. Ah yes, I remember, that's the same Iron Mountain whose archive facilities have suffered several serious fires including one recently in Buenos Aires. So much for their security credentials. It could be argued that Iron Mountain is "the biggest threat to [its customers'] information security"! Fail #9.

The 'key findings - your next steps' section of the 'personalized report' kind of makes sense as far as it goes, but bears no obvious relation to the benchmark, the survey or the data.  Although, for example, I'm personally in favor of step 1 'Take it to the top - Get board level support by taking a strategic approach to information management', I'm also in favor of 'Don't run with scissors' and 'Don't smoke', which make about as much sense in this context. Fail #10.

As to the actual PwC/Iron Mountain survey, I encourage you to take a critical look at the survey report, and make of it what you will. Read past the annoying repetitive references to "the mid market" (which I think must be marketing-speak for medium-sized organizations) and the wrongly-labeled graphs. Set aside the spurious references to additional information and news headlines muddled in with the survey data, and the buzzwords-du-jour. Consider the "comprehensive questionnaire" of just 34 statements and the dubious statistics arising, and see what you have left.

Then read the report's disclaimer very carefully:
"This publication has been prepared for general guidance on matters of interest only, and does not constitute professional advice. You should not act upon the information contained in this publication without obtaining specific professional advice. No representation or warranty (express or implied) is given as to the accuracy or completeness of the information contained in this publication, and, to the extent permitted by law, PricewaterhouseCoopers LLP, its members, employees and agents do not accept or assume any liability, responsibility or duty of care for any consequences of you or anyone else acting, or refraining to act, in reliance on the information contained in this publication or for any decision based on it."
[PwC, I'm very disappointed in you. Your work on the UK DTI/BERR security surveys has been pretty good. Your consultants and auditors are generally competent and well-respected. What on Earth possessed you to get involved in this nonsense? Are you really that hard up these days?]

05 June 2014

Security metrics books

Dell security analyst Ben Knowles has reviewed and compared four information security metrics books:

  • Andrew Jaquith's Security Metrics (aka "the Treefrog book"!)
  • Caroline Wong's Security Metrics
  • Lance Hayden's IT Security Metrics
  • and ours, PRAGMATIC Security Metrics
Ben's comments are sound: while these books present differing perspectives and messages, all four have merit.  We discussed the first three books (and more) in the literature review in PRAGMATIC Security Metrics, and on SecurityMetametrics.com.

06 May 2014

Enterprise Security Metrics report

A new 28-page research report by George Campbell's Security Executive Council (SEC) concerns the status of physical security metrics. Enterprise Security Metrics: A Snapshot Assessment of Practices (free but registration required) "provides a snapshot of the use of metrics in corporate security management. It includes information on the current state-of-the-art of various models of benchmarking and security metrics, types of metrics, judging the maturity of security metrics programs as well as challenges and opportunities for those undertaking security metrics programs. This report specifically summarizes our learned experience from corporate security measures and metrics initiatives."

The report refers to SEC's ongoing metrics research but unfortunately does not go into details about the methods.  A note on page 7 refers to a survey of 27 companies representing "a solid cross section of industry sectors [with] mature and multi-service corporate security programs, several engaging in best practice operations". The small sample was presumably drawn from members or clients of the SEC, meaning that it was not random but self-selected from organizations with a clear interest in security metrics. Nevertheless, statistics aside, the findings and conclusions are well worth reading in more general terms - for example:
"Nearly 70 percent of respondents stated that they don’t collect security program metrics for the purposes of presenting to senior management ... This lack of engagement remains as a significant internal obstacle to metrics acceptance and development. Too many corporate security practitioners have either avoided or failed to understand the relevance of such measures. Security organizations have the data; they are willing to count events and other activity data but they apparently don’t see the need to use it to build actionable, influential metrics that can effectively influence senior management."
I like the phrase 'actionable, influential metrics'. Metrics that are neither actionable nor influential have little practical value. They are "coffee table metrics", the sorts of things one might idly skim through in a glossy magazine.  Metrics that are influential but not actionable can cause consternation: we know there is something wrong but we don't know how to fix it. Metrics that are actionable but not influential have no impact: since they neither influence nor support decisions, they are essentially pointless. For what it's worth, most such metrics tend to be cheap and easy to gather, so the measurement costs are quite low, although there is a hidden cost in that they can be distracting, giving the impression that someone is on top of security metrics whereas in fact they are not.
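The four combinations just described amount to a simple two-by-two classification, which can be tabulated as a quick sketch (my own summary of the reasoning above, with my own wording for each quadrant):

```python
# Classify a metric by the two properties discussed above.
# The quadrant descriptions are my paraphrase, not SEC terminology.

def metric_quadrant(actionable: bool, influential: bool) -> str:
    if actionable and influential:
        return "valuable: informs decisions and tells us what to do"
    if influential and not actionable:
        return "consternation: we know something is wrong, not how to fix it"
    if actionable and not influential:
        return "no impact: nobody acts on it, essentially pointless"
    return "coffee table metric: idle curiosity only"

print(metric_quadrant(True, True))
print(metric_quadrant(False, False))
```

Only the first quadrant earns its keep; the other three are candidates for retirement.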

The report mentions the commonplace KRIs and KPIs (Key Risk and Performance Indicators) plus two metrics that were new to me: Key Influence Indicators ("How do our metrics influence governance policy, business unit accountability and personal behavior?") and Key Value Indicators ("How have our metrics demonstrated tangible, actionable and measurable benefit to the enterprise?"). Influence and value are two of several characteristics of metrics, or metametrics. The PRAGMATIC method uses nine specific metametrics to determine the net value of a metric.

There is a common theme underlying the report's conclusions, namely that more effort should be put into identifying baseline metrics for all aspects of security in order to enable benchmarking comparisons between organizations. Security management practices and metrics requirements vary widely in practice, largely because security risks vary widely, hence the particular security concerns that drive one organization to select specific security metrics may not coincide with those of another. However, an appendix to the SEC report offers a maturity metric measuring the status of an organization's [physical] security metrics program by assessing the anticipated parts of such a program. The metric is similar in style to those we described in PRAGMATIC Security Metrics, a form of metric that encourages us to break down and systematically assess complex situations within the organization (I found them well suited for internal audits and process improvement initiatives). Maturity metrics are also a promising approach for benchmarking comparisons of multiple organizations.
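The general shape of such a maturity metric is easy to sketch. The following is my own construction, not the SEC's actual instrument, and the program components and ratings are invented for illustration:

```python
# Sketch of a maturity metric: rate each anticipated component of a
# security metrics program on a 0-5 CMM-style scale, then report the
# overall maturity and the weakest areas first.
# Component names and ratings below are hypothetical.

components = {
    "Defined metrics objectives": 3,
    "Data collection processes":  2,
    "Analysis and reporting":     4,
    "Management engagement":      1,
    "Continuous improvement":     2,
}

overall = sum(components.values()) / len(components)
print(f"Overall maturity: {overall:.1f} of 5")
for name, level in sorted(components.items(), key=lambda kv: kv[1]):
    print(f"  level {level}: {name}")   # weakest areas first = where to act
```

Because every organization is scored against the same checklist of components, the overall figure supports exactly the kind of cross-organization benchmarking the report calls for.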

Another conclusion of the report is that metrics are needed for compliance assessment purposes: we discussed this point too in PRAGMATIC Security Metrics. Industry regulators and authorities (such as the other SEC!) need rational ways in which to measure and assess organizations on all sorts of criteria including governance, risk and security practices. The conventional approach is to specify and mandate certain requirements, in which case the measurement process boils down to someone (hopefully, a competent, independent and diligent third party) determining whether the stated requirements have or have not been fulfilled - fine in theory but harder to achieve in practice since there are so many variables. PCI-DSS, for instance, requires a number of specific security controls supposedly to secure cardholder data, and PCI assessments attempt to confirm that they are all in place. We know from Target and many other breaches that the PCI controls are imperfect, and that a "pass" on the PCI assessment does not necessarily mean that cardholder data are in fact adequately secured. Furthermore, cardholder data are just a fraction of most organizations' information assets, hence ticking the PCI compliance box does not necessarily mean the organization has adequate information security as a whole (it is an indicator, perhaps, but primarily concerns compliance with externally-imposed obligations). It would be practically impossible to extend the PCI-type approach to cover all of information security, physical security, risk management and governance, whereas other approaches such as maturity metrics could be used both to measure and to drive improvements.

George ends the report with a plea to collaborate with other metrics professionals. I welcome the initiative and will definitely get in touch! 

03 April 2014

What are KPIs?

Krag and I have discussed this question from time to time and, although we are broadly aligned in our thinking, we haven't yet totally resolved our differences ... which makes the exchanges fun.

With that in mind, I always wonder what someone really means when they talk about KPIs. To some, Key Performance Indicator has a very specific and particular meaning, although I suspect if we assembled a dozen such people in a room to discuss it, we'd soon end up realizing that we have more than a dozen different interpretations! 

To others (including me, as it happens), KPI is a generic, blanket term for a class or type of metric that satisfies the criteria implied by the term: 
  • Key implies that the metric itself is especially important, crucial or vital even, given that there are many, many different ways to measure and assess things but most of them are of limited value. Picking out the few things that truly matter is a core issue in metrics. 'Spam volume' is an example of a metric that is both narrow and shallow, whereas 'email risk level' is a much broader, deeper and richer metric, and is far more likely to be considered key (even if we happen to be talking specifically about metrics relating to spam filtering). However, the criticality and value of metrics does depend on the contexts or situations being measured and the perspectives and information needs of their users. It is conceivable that 'spam volume' may be considered a KPI for the anti-spam controls, but that's a narrow perspective. Key may also refer to the performance, in the sense that the KPI is an indicator concerning an important issue; 
  • Performance is a distinctly ambiguous word, implying concern for the process and/or its outcome. Are we measuring key activities (typically in order to assess and improve the efficiency of the process) or its outcomes (typically to assess and improve the effectiveness of the process), neither or both? I have seen KPI used in several different senses, although usually it is not totally clear (perhaps not even to the person discussing it!) which one is meant; 
  • Indicator generally means simply a metric or measure but it may imply imprecision or approximation. A typical car's fuel gauge, for instance, gives a fairly vague indication of the amount of fuel in the tank, whereas the equivalent metric on an aircraft tends to be much more precise, accurate and reliable, for obvious reasons. The car's fuel gauge may not tell you how many litres remain but if it heads into the red zone, you know you need to find a filling station soon. Indicator often also implies a forward-looking or predictive rather than purely historical measure. 'Trends' are common examples, used to manage various aspects of information security where precision is nice but not vital (e.g. supporting security investment or resourcing decisions). 
As far as I'm concerned, then, a KPI is generally a predictive metric concerning some critical outcome of an important process. In the information security context, KPIs are most likely to measure core security processes such as risk assessment, and key controls such as authentication and access control. Efficiency is secondary to effectiveness, in that security failures resulting from ineffective controls can lead to serious and potentially very damaging incidents, whereas inefficient controls are merely a bit wasteful (that's actually a strong bias, one worth challenging in situations where security becomes onerous for information users and administrators, perhaps so onerous that they bypass or disable the controls, sending us back to square one!).

21 March 2014

Avoiding metrics myopia

Things being measured are patently being observed in a very specific, focused manner. Things that are observed so closely tend to be presumed by others to be of concern to the measurer/observer, at least. Things under the measurement ruler therefore assume a certain significance simply by dint of their being closely observed, regardless of any other factors.

We see this 'fascination bias' being actively exploited by cynical promoters, marketers and advertisers on a daily basis through an endless stream of largely banal and unscientific online polls and infographics, their true purpose made all the more obvious by the use of bright primary-color eye-catching graphics. They are manipulating readers to believe that since something has been measured, it must be important. How many of us take the trouble to think about the quality of the metrics, or about all the other aspects that haven't been measured? Like bunnies in the headlights, we stare at the numbers.

In PRAGMATIC Security Metrics, we outlined 21 observer biases drawn from an even longer list compiled by Kahneman, Slovic and Tversky in Judgment Under Uncertainty (1982): what I'm calling 'fascination bias' has some resemblance to what Kahneman et al. described as 'attentional bias', the tendency to neglect relevant data when making judgments of a correlation or association.

Fascination bias creates a genuine concern in that we tend to measure things that are relatively easy to measure, and place undue faith in those metrics relative to other factors that are not being measured. Back in 2011, Michal Zalewski said in his blog:
"Using metrics as long-term performance indicators is a very dangerous path: they do not really tell you how secure you are, because we have absolutely no clue how to compute that. Instead, by focusing on hundreds of trivial and often irrelevant data points, they take your eyes off the new and the unknown."
While we don't entirely accept that we 'have no clue how to compute security performance', his point about neglecting other risks and challenges due to being inordinately focused on specific metrics is sound.  It's only natural that what gets measured gets addressed (though not necessarily improved!). The unfortunate corollary is that what doesn't get measured gets neglected.

The upshot of this is that there is a subtle obligation on those who choose metrics to find ways to measure all the important matters, even if some of those metrics are expensive/complex/qualitative/whatever. It's simply not good enough to measure the easy stuff, such as the numbers that assorted security systems constantly pump out 'for free'. It's inappropriate to disregard harder-to-measure issues such as culture, ethics, awareness and trust, just as it is inappropriate to restrict your metrics to IT or cybersecurity rather than information security.

That's one of the key reasons why we favor the systematic top-down GQM approach: if you start by figuring out the Goals or objectives of security, expand on the obvious Questions that arise and only then pick out a bunch of potential Metrics to answer those questions, it's much harder to overlook important factors. As to figuring out the goals or objectives, conceptual frameworks for information security such as BMIS and ISO27k, based on fundamental principles, are an obvious way to kick off the thinking process and frame the initial discussion.
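The Goal-Question-Metric hierarchy is easy to capture as a simple data structure, which also makes coverage gaps jump out. The goal, questions and candidate metrics below are my own illustrative examples, not prescriptions:

```python
# A minimal sketch of top-down GQM: goals drive questions, and only
# then are candidate metrics chosen to answer those questions.
# All entries below are hypothetical examples.

gqm = {
    "Protect information against unauthorized disclosure": {
        "Are access rights granted appropriately?": [
            "Proportion of access requests formally approved",
            "Age profile of unreviewed access rights",
        ],
        "How quickly are leavers' accounts disabled?": [
            "Median hours between termination and account disablement",
        ],
        "Is sensitive data encrypted in transit?": [],   # gap!
    },
}

# Walking the structure exposes the gaps: a question with an empty
# metric list is an important issue we are not yet measuring.
for goal, questions in gqm.items():
    print(goal)
    for question, metrics in questions.items():
        status = ", ".join(metrics) if metrics else "<< NO METRIC YET >>"
        print(f"  {question} -> {status}")
```

Starting from goals rather than from whatever data happens to be lying around is precisely the antidote to fascination bias.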

13 March 2014

ISO27k Toolkit

On the toolkit theme, I have just updated the FREE ISO27k Toolkit over at ISO27001security.com with an Excel workbook used to track progress on implementing the ISO/IEC 27001 and 27002 standards.

Thanks mostly to Ed Hodgson, the gap analysis/SoA workbook in the ISO27k Toolkit has been updated for the 2013 releases of the standards.

The new version has two main spreadsheets:
  1. The first sheet is used to check and track progress towards implementing an ISMS complying with all the mandatory front parts of ISO/IEC 27001:2013 - mandatory, that is, if you intend to get your ISMS certified.  I made a few little wording changes and editorial decisions in this section, so if you use this for certification purposes, please double-check against the requirements formally specified in the standard and don't rely entirely on the spreadsheet!  The spreadsheet is not definitive.  The standard rules.
  2. The second sheet covers the discretionary parts, namely the controls listed briefly in Annex A of '27001 and explained in more depth in ISO/IEC 27002:2013 plus any controls that you add or change on the list, for example additional legal, regulatory or contractual obligations, or ISO 22301, NIST SP800s or whatever.  Don't be afraid to adapt the list of controls!  '27001 Annex A and '27002 are intended to be 'reasonably comprehensive' starting points, laying out a decent set of good security practices, but your information security risks and hence control requirements are unique to you. 

For both parts, you simply select the relevant colour-coded status indicator from a drop-down list on each item, and record brief notes to explain the situation.  The status levels are adapted from Carnegie Mellon's Capability Maturity Model, showing progress from not-implemented-at-all (bright red) up to fully-implemented-working-and-auditable (dark green), plus grey options for "? unknown" (i.e. status not yet checked) and "Not applicable".

A third metrics spreadsheet simply counts the number of items at each status level in each of the two main sheets, and draws a pair of pretty pie charts showing the proportions ... 

These very simple metrics clearly indicate progress towards a compliant, working, provable ISMS managing a reasonably comprehensive suite of information security controls, as both pies gradually go dark green.  [Pies going green is usually a bad sign, but in this particular case dark green pies are tasty  :-)]   Actually, in Excel, it is easy to generate whatever format charts you like, line graphs showing the trends month-by-month for instance. That's left as an exercise for the reader.
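For anyone who prefers code to Excel, the counting underlying the metrics sheet can be sketched in a few lines of Python. The status labels and item statuses below are assumptions for illustration, not the workbook's actual wording:

```python
# Rough equivalent of the metrics sheet: count items at each
# maturity status and report the proportions the pie charts show.
# Status labels and the example data are hypothetical.

from collections import Counter

STATUSES = ["Not implemented", "Planned", "Partly implemented",
            "Largely implemented", "Fully implemented", "Unknown", "N/A"]

items = ["Fully implemented", "Partly implemented", "Not implemented",
         "Fully implemented", "Planned", "Unknown"]   # made-up statuses

counts = Counter(items)
measurable = [s for s in items if s not in ("Unknown", "N/A")]
for status in STATUSES:
    n = counts.get(status, 0)
    if n:
        print(f"{status}: {n} ({100 * n / len(items):.0f}%)")
print(f"Progress: {sum(s == 'Fully implemented' for s in measurable)}"
      f"/{len(measurable)} measurable items fully implemented")
```

Run periodically and stored, these counts give you exactly the month-by-month trend lines mentioned above.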

As with all the ISO27k Toolkit items, it is provided free under a Creative Commons license that allows you to use and adapt it as much as you want for your own purposes, and to share it under the same terms (but not to sell it!). 

I take full responsibility for any errors in the spreadsheets.  If/when you find any errors, please let me know.  My Excel skills are rudimentary: I'm sure an Excel wizard would be able to come up with a sexier version, with better error-checking, better usability, more easily extendable, and perhaps more functional.  If you do improve it, please share it back for the benefit of the whole user community.