Defense Technical Information Center’s (DTIC’s) Hidden Gems

The Defense Technical Information Center (DTIC) provides a host of products and services to the DoD and to users in government, industry and academia.

One of the important facets of DTIC’s services is access to a huge trove of scientific and technical information (STI) covering close to seven decades of military research and development (R&D). This article, along with succeeding articles in future CSIAC Journals, covers one small area of STI and its roots in 20th-century military R&D by identifying early documents that addressed new concepts and ideas. These “hidden gems” are interesting and relevant for two reasons: first, they highlight the early investments that the DoD has made in virtually every aspect of science and technology; and second, they provide a glimpse into the underlying fundamentals we are still researching today, including ideas and concepts that are at once surprisingly cutting edge and “ancient” (on technology timelines). Since this is a special edition of the CSIAC Journal focusing on Software Assurance, here are a few hidden gems from the DTIC treasure chest that might warrant a closer look from both the curious and the serious researcher. Who knows, it might lead you in a new and different direction. To paraphrase Winston Churchill: the farther backward you can look, the farther forward you can see.

Inherent in providing Software Assurance to a community is a testing methodology that provides a set of guarantees. Static testing is an approach that examines code and algorithms to determine what should happen, checking behavior as formally as possible without explicitly executing the code. Some of the first concepts in this area came from military research. Our first document, from 1976 (41 years ago), is titled “Protection Errors in Operating Systems: Validation of Critical Conditions” and is available from DTIC. It recognizes the difficulty of validating complex software and provides methodologies for reasoning about it. There is also a very good description on page 6 (page 14 from the cover; the pages aren’t numbered) of a basic principle for validating operating system kernel operations at the time of invocation. It identifies the possibility of incorrect operation when the timing of an invocation is not consistent with the state of the entities being acted on; something that much later came to be called a Time of Check/Time of Use (TOCTOU) error or attack.
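The 1976 report predates the TOCTOU label, but the failure mode it describes is easy to sketch in modern terms. The following Python fragment (an illustrative sketch of our own, not drawn from the report) contrasts a racy check-then-use pattern with a version in which the check and the use refer to the same already-open file:

```python
import os

def read_if_small_unsafe(path, limit=1024):
    # VULNERABLE pattern: check, then use. Between os.stat() and
    # open(), another process could swap the file (e.g., replace it
    # with a symlink to something else) -- a classic TOCTOU race.
    if os.stat(path).st_size <= limit:
        with open(path, "rb") as f:
            return f.read()
    return None

def read_if_small_safe(path, limit=1024):
    # Safer pattern: open once, then check the already-open file
    # object with fstat. The check and the use now act on the same
    # underlying file, closing the race window.
    with open(path, "rb") as f:
        if os.fstat(f.fileno()).st_size <= limit:
            return f.read()
    return None
```

The principle is the one the report identifies: the state you validated must still hold at the moment of use, which here is guaranteed by validating the opened descriptor rather than the path name.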

Just to show that early DoD guidance included concepts we still find ourselves grappling with or discovering anew, the following two documents, from 1972 and 1988, apply almost as well now as they did then. The second document we highlight is a DoD manual from 1972, the ADP Security Manual, DoD 5200.28-M (available from DTIC). Need a definition of risk management applied to military computer systems? Think risk management is a 21st-century concept? Look on page 2, where the following guidance appears:

The potential means by which a computer system can be adequately secured are virtually unlimited. The safeguards adopted must be consistent with available technology, the frequency of processing, the classification of the data handled or the information to be produced, the environment in which the ADP System operates, the degree of risk which can be tolerated, and other factors which may be unique to the installation involved… it is understood that all of the techniques described in this manual may not be economically justified after a cost versus risk evaluation. Therefore, selected subsets of the techniques included in this manual, with appropriate trade-offs, may be used to gain the level of security required for classification category, etc., to be secured.

Not bad for 45 years ago. This document, and many others, contain additional ideas that highlight early perspectives on risk, malware, network vulnerabilities, and more. In the 1960s and 1970s, many of these ideas could not be practically or technologically implemented, but the germinal ideas and coherent thinking about what was to come in computers, software, and networks were well developed. Look at pages 23 and 24 regarding the protections that should be provided by the operating system itself (although, at that point in time, the line between operating-system aspects and hardware-specific aspects was drawn a bit differently):

  • The execution state of a processor should include one or more variables, i.e., “protection state variables,” which determine the interpretation of instructions executed by the processor. For example, a processor might have a master mode/user mode protection state variable, in which certain instructions are illegal except in master mode. Modification of the protection state variables shall be so constrained by the operating system and hardware that a user cannot access information for which he has no authorization.
  • The ability of a processor to access locations in memory (hereinafter to include primary and auxiliary memory) should be controlled (e.g., in user mode, a memory access control register might allow access only to memory locations allocated to the user by the O/S).
  • All possible operation codes, with all possible tags or modifiers, whether legal or not, should produce known responses by the computer.
  • Error detection should be performed on each fetch cycle of an instruction and its operand (e.g., parity check and address bounds check).

Where would buffer overflows be today if that last recommendation had been integrated more completely over the past 45 years?
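The manual’s call for an address bounds check on every access is exactly the discipline whose absence makes buffer overflows possible. As an illustration of our own (not taken from the manual), here is a minimal Python sketch of a copy routine that enforces the bound before writing:

```python
def bounded_copy(dst: bytearray, src: bytes) -> int:
    """Copy src into dst only if it fits; refuse otherwise.

    This mimics the 'address bounds check' the 1972 manual asks for
    on each memory access: an out-of-range write is rejected instead
    of silently corrupting whatever sits past the end of the buffer.
    """
    if len(src) > len(dst):
        raise ValueError("copy would overflow destination buffer")
    dst[:len(src)] = src
    return len(src)
```

In C terms, this is the difference between an unchecked strcpy and a copy that compares lengths first; the manual asked for the check to be enforced by the hardware on every fetch, not left to each program.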

If we chase this document lineage forward to 1988, to DoD Directive 5200.28, we can see the evolution of awareness of security across computer systems and networks. The document is now titled “Security Requirements for Automated Information Systems (AIS)”, and it applies across classified, sensitive, and unclassified information. Note also the change in terminology from ADP to AIS. The earlier manual was geared specifically toward classified systems, thought at the time to be the only computer systems needing enhanced protections. This third document, also available from DTIC, reflects the more advanced state of computers and networks at the time and includes expanded guidance on risk management, accreditation, information sensitivity, and more.

Of course, it is easy in hindsight to pick the best parts and identify where they could have been helpful over time; that is not the intent here. We’d like to identify documents that convey foundational thought and concepts that help us place “where we are” in a stronger context, and point to ideas that are consistent across generations. It is amazing what you can find when you look.
