10 July 2014

Pragmatic use for a PRAGMATIC security metric

I have just published a tool developed by Ed Hodgson, Marty Carter and me to help people estimate how long their ISO/IEC 27001 ISMS implementation projects will take.

The tool is an Excel spreadsheet (DOWNLOAD). As with the remainder of the ISO27k Toolkit, it is free to use and is covered by a Creative Commons license. I will roll it into the Toolkit when the Toolkit is next updated.

The estimated project timescale depends on how you score your organization against a set of criteria - things such as the extent of management support for the ISMS project, and its strategic fit. The scoring process uses a percentage scale with textual descriptions at four points on the scale, similar to the scoring scales Krag and I described in PRAGMATIC Security Metrics. The criteria are weighted, since some are way more important than others. The scores you enter either increase or decrease the estimated timescale from a default value, using a model coded into the spreadsheet.
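For anyone curious about the mechanics outside Excel, here is a minimal sketch in Python of a weighted-criteria estimator of this general kind. The criteria, weights, default timescale and adjustment formula below are purely illustrative assumptions, not the values or formula actually used in the spreadsheet.

    # Minimal sketch of a weighted-criteria timescale estimator.
    # All names and numbers here are illustrative placeholders, not
    # the values in the ISO27k spreadsheet model.

    DEFAULT_MONTHS = 12  # assumed baseline timescale

    # Each criterion has a weight (relative importance) and a score (0-100%).
    criteria = {
        "management_support":  {"weight": 3, "score": 80},
        "strategic_fit":       {"weight": 2, "score": 60},
        "resources_available": {"weight": 1, "score": 40},
    }

    def estimate_months(criteria, default=DEFAULT_MONTHS):
        """Scale the default timescale according to the weighted average
        score: 50% leaves it unchanged, higher scores shorten it and
        lower scores lengthen it."""
        total_weight = sum(c["weight"] for c in criteria.values())
        weighted_avg = sum(c["weight"] * c["score"] for c in criteria.values()) / total_weight
        multiplier = 1.5 - (weighted_avg / 100)   # 0% -> 1.5x, 100% -> 0.5x
        return default * multiplier

    print(round(estimate_months(criteria), 1))   # ~10.0 months for this example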

Ed enhanced my original model with a more sophisticated method of calculation: his version substantially extends the timescale if you score low against any criterion, emphasizing the adverse impact of issues such as limited management support and poor strategic fit. I have left both versions of the model in the file so you can try them and compare them to see which works best for you … and of course you can play with the models, the criteria and the weightings as well as the scores. I suspect that Ed's version is more accurate than mine, but maybe both are way off-base. Perhaps we have neglected some factor that you found critical? Perhaps the weightings or the default timescale are wrong? If you have successfully completed ISMS implementation projects, please take a look at the criteria and the models, and maybe push your numbers through to see how accurate the estimates would have been.
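To illustrate the kind of refinement Ed made (this is an assumed mechanism, not his actual formula), the sketch below stretches a baseline estimate for every criterion that scores below a threshold, so a single weak area can dominate the result:

    # Assumed illustration of a low-score penalty, not Ed's actual calculation.
    # Every criterion scoring below the threshold stretches the estimate,
    # so limited management support or poor strategic fit dominates.

    LOW_SCORE_THRESHOLD = 30   # illustrative percentage
    LOW_SCORE_PENALTY   = 1.5  # illustrative stretch factor per weak criterion

    def penalised_estimate(base_months, scores):
        """Apply a multiplicative penalty for each low-scoring criterion."""
        months = base_months
        for score in scores:
            if score < LOW_SCORE_THRESHOLD:
                months *= LOW_SCORE_PENALTY
        return months

    print(penalised_estimate(12, [80, 60, 20]))   # one weak criterion -> 18.0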

Feedback comments are very welcome – improvement suggestions especially – preferably on the ISO27k Forum for the benefit of the whole community, otherwise directly to me if you’re shy.

I’m afraid we haven’t yet managed to figure out how to estimate the resourcing (man-days) needed for the implementation project, as we originally planned. A couple of approaches have been suggested (such as breaking down the requirements in ISO/IEC 27001 to identify the activities and competences/skills needed) but it will take more effort to turn the suggestions into a practical tool. If you are inspired to have a go at developing a suitable tool, please make a start and I can set up another collaborative project on Google Docs to continue the development. Further general suggestions are fine but we really need something more concrete to sink our teeth into – a draft or skeleton resourcing estimator would be good. How would you go about it?
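By way of a starting point only, here is one possible skeleton in Python for the kind of breakdown suggested above. The clause labels, activities, effort figures and skills are placeholders I have made up, not an agreed mapping of the ISO/IEC 27001 requirements:

    # Illustrative skeleton for a resourcing estimator: break the standard's
    # requirements into activities, each with an effort estimate and the
    # competences/skills needed. All entries below are placeholders.

    activities = [
        {"clause": "4 Context of the organization", "activity": "Scope the ISMS",
         "man_days": 5,  "skills": ["ISMS", "business analysis"]},
        {"clause": "6 Planning",                    "activity": "Information risk assessment",
         "man_days": 15, "skills": ["risk analysis"]},
        {"clause": "7 Support",                     "activity": "Awareness and training",
         "man_days": 10, "skills": ["training", "communications"]},
        {"clause": "9 Performance evaluation",      "activity": "Internal ISMS audit",
         "man_days": 8,  "skills": ["auditing"]},
    ]

    total_man_days = sum(a["man_days"] for a in activities)
    skills_needed  = sorted({s for a in activities for s in a["skills"]})
    print(total_man_days, "man-days; skills:", ", ".join(skills_needed))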


Regards,
Gary Hinson  (Gary@isect.com)  

07 July 2014

ASIS report on security metrics

"Persuading Senior Management with Effective, Evaluated Security Metrics" is a lengthy new research report from ASIS Foundation, a membership body for (primarily) physical security professionals.

Quoting from the report's executive summary:
"Security metrics support the value proposition of an organization’s security operation. Without compelling metrics, security professionals and their budgets continue largely on the intuition of company leadership. With metrics, the security function grounds itself on measurable results that correlate with investment, and the security professional can speak to leadership in a familiar business language."
Fair enough. That's similar to what we wrote in PRAGMATIC Security Metrics about measurable results and business orientation.
"Security metrics are vital, but in the field and in the literature one finds few tested metrics and little guidance on using metrics effectively to inform and persuade senior management." 
I'm not sure I agree that there are 'few tested security metrics' - it all depends on what we mean by 'tested' and 'security metrics'. I know there are hundreds of information security metrics in use, scores of which I believe are relatively widespread. I also know how easy it is to specify literally thousands of potential information security metrics covering various aspects and facets of our field, especially if one considers variants of any given metric as distinct metrics (e.g. 'the malware threat' could be measured in dozens of different ways, and each of those measures could be expressed or reported in dozens of ways, implying several gross of malware threat metrics).

We described and evaluated about 150 information security metrics examples in PRAGMATIC Security Metrics, and mentioned or hinted at numerous variants that might address some of the shortcomings we found in the examples.
"To address the gap, in spring 2013 the ASIS Foundation sponsored a major research project designed to add to the body of knowledge about security metrics and to empower security professionals to better assess and present metrics. The Foundation awarded a grant to Global Skills X-change (GSX), partnered with Ohlhausen Research, to carry out the project." 
GSX, tagline "Define. Measure. Optimize.", describes itself as "... a professional services firm that specializes in designing workforce education strategies and processes, which allow customers to meet their specific performance goals. The GSX core business model revolves around defining functional competency models and developing valid and reliable assessment tools as the foundation of credentialing and educational programs."

As to Ohlhausen Research, "A researcher in the security field for more than 25 years, [Peter Ohlhausen, President of Ohlhausen Research Inc.] has assisted in the multi-year revision of Protection of Assets, served as senior editor of Security Management magazine, and conducted numerous research and consulting projects for the U.S. Department of Justice, U.S. Department of Homeland Security, ASIS, and corporate clients."
"This report provides the project’s findings, including its three practical, actionable products:
  • The Security Metrics Evaluation Tool (Security MET), which security professionals can self-administer to develop, evaluate, and improve security metrics 
  • A library of metric descriptions, each evaluated according to the Security MET criteria 
  • Guidelines for effective use of security metrics to inform and persuade senior management, with an emphasis on organizational risk and return on investment"
Security MET turns out to be a method for assessing and scoring metrics according to 9 criteria in 3 categories, described in some detail in Appendix A of the ASIS report:
Technical Criteria – Category 1
1. Reliability
2. Validity
3. Generalizability
Operational (Security) Criteria – Category 2
4. Cost 
5. Timeliness
6. Manipulation
Strategic (Corporate) Criteria – Category 3
7. Return on Investment
8. Organizational Relevance
9. Communication
I see interesting parallels to the 9 PRAGMATIC criteria we described. Security MET has users score metrics against each criterion on a 1-to-5 scale with three defined scoring points, rather than the percentage scales with four defined scoring points that we prefer, but the process is otherwise much the same. It appears that we have converged on a generalized method for assessing or evaluating metrics. However, PRAGMATIC Security Metrics, and indeed other metrics books, are notably absent from their "comprehensive review of the current state of metric development and application". Information security is barely even mentioned in the entire report, an unfortunate omission given the convergence of our fields.
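To make the scoring arithmetic concrete, here is a minimal sketch: nine criteria, each scored 1 to 5, totalling a maximum of 45. The nine criteria and the scale come from the report; the example metric and its scores are invented for illustration.

    # Sketch of Security MET-style scoring: nine criteria, each scored
    # 1-5, giving a maximum possible total of 45. The example scores
    # below are invented for illustration.

    MET_CRITERIA = [
        # Technical
        "Reliability", "Validity", "Generalizability",
        # Operational (Security)
        "Cost", "Timeliness", "Manipulation",
        # Strategic (Corporate)
        "Return on Investment", "Organizational Relevance", "Communication",
    ]

    def met_score(scores):
        """Sum the 1-5 scores across the nine criteria (maximum 45)."""
        assert set(scores) == set(MET_CRITERIA), "score every criterion"
        assert all(1 <= s <= 5 for s in scores.values()), "scores must be 1-5"
        return sum(scores.values())

    example = {c: 3 for c in MET_CRITERIA}   # a middling metric ...
    example["Communication"] = 5             # ... with a hypothetical strength
    example["Cost"] = 2                      # ... and a hypothetical weakness
    print(met_score(example), "out of", 5 * len(MET_CRITERIA))   # 28 out of 45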

The ASIS report goes on to list just 16 [physical] security metrics, described as "authentic examples" that had been identified through the researchers' telephone interviews with respondents:
1. Office Space Usage Metric
2. Security Activity Metric
3. Environmental Risk Metric
4. Averted External Loss Metric
5. Security Audit Metric
6. Officer Performance Metric Panel
7. Security-Safety Metric
8. Security Incidents Metric
9. Personnel Security Clearance Processing Metric
10. Loss Reduction/Security Cost Metric
11. Operations Downtime Reduction Metric
12. Due Diligence Metric
13. Shortage/Shrinkage Metric
14. Phone Theft Metric
15. Security Inspection Findings Metric
16. Infringing Website Compliance Metric
These metrics have each been MET-scored by a handful of people, which could have generated statistics on the consistency or reliability of the scoring method, although no such statistics are actually provided in the report. Nor is there any information on whether the scores remain consistent when the same assessors repeat their assessments. The authors do, however, note that "The total score may suggest how close the metric is to attaining the highest possible score (45), but it is not likely to be useful for comparing different metrics, as the scoring would be different for users in different organizations", suggesting that they are unhappy with the method's consistency between different assessors.

Overall, this report is a worthwhile contribution to the security metrics literature.  Take a look!