Automation and Ongoing Authorization Transition/Implementation


Manual methods for monitoring system controls have become largely impractical due to the growing number of applicable controls and the increasing frequency at which they must be evaluated (to support the RMF’s near real-time risk assessment). Instead, the NIST guidance strongly suggests that automation be used, to the extent possible, to generate, correlate, analyze and report security-related information. Automated tools are identified as a means to facilitate the timely and efficient delivery of information, allowing organizational officials to make informed, risk-based decisions about their information systems.

While the advantages of automation are well documented, instructions for implementing automated tools are not as prevalent, especially for Ongoing Assessment (OA). Such instructions are difficult to provide because the relevant factors are specific to each information system and its implemented controls, and because NIST must avoid endorsing specific products. Nonetheless, some instruction is still needed for organizations attempting to use automated tools. Section 2.3 of NIST SP 800-137 identifies the role of automation in ISCM, but likely the most specific guidance, in terms of an example of applying automated tools, can be found in Appendix I of NIST SP 800-53A Rev. 4, “Ongoing Assessment and Automation”.[1] This helpful (but brief) discussion describes the use of a “desired state specification” with specific metrics that provide a point of comparison against the actual state. In essence, this baseline can be compared to the information collected during an automated assessment to identify anomalies that may indicate a change in the effectiveness of one or more security controls. This general approach provides some insight into the use of automation, but it does not necessarily simplify the automated process for practitioners.
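The desired-state comparison can be pictured with a minimal sketch. The metric names and values below are hypothetical placeholders, not drawn from the NIST guidance; the point is only the shape of the comparison: a baseline of expected values checked against what the automated assessment actually collected.

```python
# Compare a desired state specification against the actual state collected
# by an automated assessment, reporting an anomaly for each metric whose
# observed value differs from the expected baseline value.
def find_anomalies(desired_state, actual_state):
    anomalies = {}
    for metric, expected in desired_state.items():
        observed = actual_state.get(metric)  # None if the metric was not collected
        if observed != expected:
            anomalies[metric] = {"expected": expected, "observed": observed}
    return anomalies

# Hypothetical metrics for illustration only.
desired = {"av_signature_age_days_max": 1, "fips_mode_enabled": True}
actual = {"av_signature_age_days_max": 7, "fips_mode_enabled": True}
print(find_anomalies(desired, actual))
# {'av_signature_age_days_max': {'expected': 1, 'observed': 7}}
```

An anomaly here does not automatically mean a control has failed; as the guidance notes, it is a signal that the effectiveness of one or more controls may have changed and warrants analysis.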

RMF’s ongoing component is closely related to steps 3 and 4 of the ISCM process, which involve implementing the ISCM program through the deployment of the sensors and tools needed to collect security-related information and metrics, analyzing the collected data, and reporting the findings to the organization’s leadership. It can be difficult to define the granularity of the information to be collected, as too much information can overwhelm security analysts while too little can allow certain events to go unnoticed. This determination typically occurs during steps 1 and 2 of the ISCM process, where the organization collectively identifies the types and quantities of information to be collected. Because information systems are becoming increasingly complex and interconnected, automated tools have become a necessity; manual techniques simply cannot accommodate such large datasets, let alone their analysis. As the discussion of automation in NIST SP 800-53A Rev. 4 is fairly brief, NIST has been developing supplemental guidance in the form of a NIST Interagency Report (IR).

NIST IR 8011 Vol. 1 (Draft)[2] describes the requirements for using automated security control assessments, and defines the process to be followed and the general strategies associated with such practices. It specifically references the publication approving the OA process[3] and the government’s later realization of the challenges associated with satisfying FISMA’s cybersecurity reporting requirements. To alleviate these challenges, DHS was funded in 2012 to develop the Continuous Diagnostics and Mitigation (CDM) program. Before discussing the CDM program, it is important to first define the outlined process for automated security control monitoring.

Consistent with the OA strategy, organizations must satisfy a collection of requirements before they are considered qualified to move forward with automated control monitoring; specifically, the information system in question must have received an initial ATO under the RMF A&A process. The actual automation of assessment techniques varies depending on the security controls that have been implemented, based on steps 1, 2 and 3 of the RMF process (i.e., categorize information system, select security controls and implement security controls). While the OA strategy is performed as step 6 (monitor security controls) of the RMF, it more specifically involves the ongoing performance of steps 4 and 5 (assess security controls and authorize information system). The recognition that these repeated activities can become overly burdensome, if not infeasible, for complex systems motivates automated assessment procedures.

Directing our focus back to security control assessments, NIST SP 800-53A Rev. 4 identifies three general assessment methods:

  • Examine
  • Interview
  • Test

Each method is appropriate for specific types of security controls, but the test method is generally considered the least subjective and most effective.[4] Furthermore, the test method is the most closely related to the use of automated tools, because the other assessment methods rely on human contributions. The determination of which methods to implement is generally made during the organization’s development of its Continuous Monitoring (CM) strategy as part of the ISCM plan.

To define the role of automation, it is important to first consider the ongoing assessment process, which is best represented by Figure 17.

Figure 17: ISCM Ongoing Assessment Process[5]

The process flow represents the logical sequence of activities whereby the system and the current state of its implemented controls are evaluated against the “defined desired state”, which can be associated with the tailored security control baseline (per the defined categorization) and any applicable control overlays. Automated tools can be extremely useful in the collection and analysis of information to perform the current state vs. desired state comparison. The identified differences can be prioritized, based on the severity of the potential impact, and often represented in an ISCM dashboard to provide a holistic security perspective. Each of these capabilities simplifies the OA process and supports the RMF’s near real-time risk assessment strategy.
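The prioritization step described above can be sketched as ordering identified defects by potential impact before they reach a dashboard. The defect records and the three-level severity scale here are invented for illustration:

```python
# Order identified defects (differences between actual and desired state)
# by severity so the most significant items surface first on an ISCM dashboard.
SEVERITY_RANK = {"high": 0, "moderate": 1, "low": 2}  # illustrative scale

def prioritize(defects):
    return sorted(defects, key=lambda d: SEVERITY_RANK[d["severity"]])

defects = [
    {"id": "D-2", "severity": "low"},
    {"id": "D-1", "severity": "high"},
    {"id": "D-3", "severity": "moderate"},
]
print([d["id"] for d in prioritize(defects)])  # ['D-1', 'D-3', 'D-2']
```

A real implementation would score severity from the potential impact of each defect rather than from a fixed label, but the ordering principle is the same.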

The use of automated tools and processes provides several advantages. However, administrators sometimes implement such tools too hastily, without adequate testing and verification. NIST IR 8011 (Draft) identifies a series of preparatory activities to be considered prior to a full-scale deployment, including consideration of when and how these systems can be trusted. Furthermore, it is critical that the security teams operating and monitoring these automated tools are properly trained and fully understand each tool’s purpose and functionality. There are several disadvantages to a black-box approach in which administrators do not understand how tools operate and blindly trust the reported data (potentially gaining a false sense of security). Similarly, it is important that the tools’ implementation remain consistent with the security capability abstraction layers, which consist of the following:[6]

  • Attack Step Layer: delaying or preventing attacks
  • Functional Capability Layer[7]: security control groupings based on specific security objectives
  • Sub-Capability Layer: goals within the larger security objectives and the related tests to identify shortfalls in meeting a particular security goal
  • Control Items Layer: the actual security control requirements identified in NIST SP 800-53 Rev. 4 (Draft 5)
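One way to picture these layers is as a containment hierarchy, where each layer groups the one below it. The sketch below uses HWAM (named later in this article as a planned NIST IR 8011 volume) as the functional capability; the attack step, sub-capability goal and control-item wording are placeholders, not the actual definitions from NIST IR 8011:

```python
# Sketch of the four abstraction layers as nested groupings:
# attack step -> functional capabilities -> sub-capabilities -> control items.
capability_model = {
    "attack_step": "Gain unauthorized access (illustrative)",
    "functional_capabilities": [
        {
            "name": "Hardware Asset Management (HWAM)",
            "sub_capabilities": [
                {
                    "goal": "Only authorized devices are connected (illustrative)",
                    "control_items": ["CM-8 item (illustrative)"],
                }
            ],
        }
    ],
}

# Walk the hierarchy to list every control item reachable from the attack step,
# i.e., the lowest-level requirements whose defects roll up to the higher layers.
def control_items(model):
    return [
        item
        for cap in model["functional_capabilities"]
        for sub in cap["sub_capabilities"]
        for item in sub["control_items"]
    ]

print(control_items(capability_model))  # ['CM-8 item (illustrative)']
```

Keeping automated tools aligned with this structure means a failed low-level check can be traced upward to the security objective, and ultimately the attack step, that it supports.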

This structured approach helps to ensure the effectiveness of the automated security control assessment measures while remaining consistent with NIST’s cybersecurity strategies.

NIST IR 8011 goes on to provide a broader description of each stage of the process, including the actual vs. desired state comparison, the defect checks associated with the Sub-Capability Layer of abstraction, the documentation of assessment plans, the performance of root cause analysis for identified defects, the roles and responsibilities for this process, and the specific ties between the ISCM and automated control assessment process and step 4 of the NIST RMF. A complete description of this process is beyond the scope of this document, but the brief discussion of the associated process clearly aligns with the general goals and objectives of the RMF.

Returning to the discussion of DHS’s CDM program, its intent is to support government organizations with the tools and processes to actively monitor information systems in near real-time to identify and mitigate cybersecurity risks. The program provides commercial-off-the-shelf (COTS) tools, including network and system sensors, analytics engines and visualization dashboards, to support efficient and effective monitoring practices. The process is best illustrated by the diagram presented below.

Figure 18: CDM Monitoring, Alert, Mitigation and Reporting Process[8]

This process falls within the larger RMF process and is consistent with the ongoing strategy that involves the identification and prioritization of defects, as well as the subsequent corrective actions and the verification and reporting of the performed activities. DHS provides additional guidance related to government organizations’ acquisition of the COTS solutions as well as instructions on the implementation of the CDM program.

While the overarching strategy for ISCM programs and the related automated security control assessments is outlined in NIST IR 8011 Vol. 1 (Draft), additional volumes covering the identified security capabilities are in development. The planned collection of volumes includes the following:[9]

  • Volume 1 Overview
  • Volume 2 Hardware Asset Management (HWAM)
  • Volume 3 Software Asset Management (SWAM)
  • Volume 4 Configuration Settings Management
  • Volume 5 Vulnerability Management
  • Volume 6 Boundary Management (Physical, Filters, and Other Boundaries)
  • Volume 7 Trust Management
  • Volume 8 Security-Related Behavior Management
  • Volume 9 Credentials and Authentication Management
  • Volume 10 Privilege and Account Management
  • Volume 11 Event (Incident and Contingency) Preparation Management
  • Volume 12 Anomalous Event Detection Management
  • Volume 13 Anomalous Event Response and Recovery Management

[1] NIST SP 800-53A Rev. 4, “Assessing Security and Privacy Controls in Federal Information Systems and Organizations: Building Effective Assessment Plans”, Dec 2014

[2] NIST IR 8011 Vol. 1 (Draft), “Automation Support for Security Control Assessments: Overview”, Feb 2016. Note: NIST SP 800-53A Rev. 4 identifies the proposed document’s title as “Automation Support for Ongoing Assessment”.

[3] OMB-11-33, “FY 2011 Reporting Instructions for the Federal Information Security Management Act and Agency Privacy Management”, Sep 2011

[4] NIST IR 8011 Vol. 1 (Draft), “Automation Support for Security Control Assessments: Overview”, Feb 2016

[5] NIST IR 8011 Vol. 1 (Draft), “Automation Support for Security Control Assessments: Overview”, Feb 2016

[6] NIST IR 8011 Vol. 1 (Draft), “Automation Support for Security Control Assessments: Overview”, Feb 2016

[7] NIST SP 800-53 Rev. 4 (Draft 5) defines the concept of security capabilities in Section 4, page 21.

[8] “Continuous Diagnostics and Mitigation (CDM)”, DHS: Securing Federal Networks, Web, accessed 27 April 2017

[9] NIST IR 8011 Vol. 1 (Draft), “Automation Support for Security Control Assessments: Overview”, Feb 2016
