Setting the Stage
The final version of the training encompassed more than 850 slides with many hands-on activities. The Cyber Defense Vulnerability Insight Laboratory (Cyber DeVIL) was set up to accommodate groups of students, with each student having a computer with virtual machines, network connections to the attack server, the hands-on activities, and follow-along steps so no one would fall behind. Two pilot classes were advertised across two different geographical locations to maximize the variety of developer knowledge bases. The goal of the pilot classes was not only to train developers but also to find out what developers already knew about secure development, what would need to be added to or removed from the training to make it more viable across the Navy, and, most importantly, whether it was interesting. If the training wasn't interesting, the software assurance game would be lost before the movement even started.
The pilot classes received more candidate requests than there were seats, so selection was based on two criteria. The first criterion was experience with software development: developers were chosen with a range of experience, from recently out of school to seasoned professionals. This would provide a sense of what skills were being taught in universities as well as what had been learned over a significant career. The second criterion was prior knowledge of software assurance: developers were chosen with a range of software assurance knowledge, from none to some. This would show how the training compared to other available information and provide feedback about the style of training. Additionally, it would validate the assumption that most developers were unaware of software assurance and reveal how well they responded to the topic and to the extra work this effort would demand.
All three topics were created from open-source information, and those resources were provided to the attendees for further use. The final day included a guest speaker who shared his substantial experience as a software assurance tester and described what he had seen work to greatly improve security in government software.
KNOW YOUR ENEMY – HACKER 101
The purpose of the Hacker 101 class was twofold. First, the class was meant to illustrate the mindset of the hacker and what they could do with weaknesses in code. It was important for students to understand that functionally correct code can provide an attack pathway into the overarching system if it has even minor security oversights. Second, the class allowed the developers to play at being a hacker. This concept piqued their interest, got them to sign up for the class, and provided a fun approach to a critical new topic. It also gave them the impetus to take ownership of finding and fixing the weaknesses that could be in their own code.
The introduction to the class quoted experts stating that security weaknesses in code are rampant and that software security is not understood as an essential priority alongside functionality. The topic of Software Assurance as a component of Software Engineering was introduced. Also discussed were the National Defense Authorization Acts, in which Congress directed the Department of Defense to perform software assurance to better secure our military systems. The unclassified open-source Mandiant report4 was noted as an example of real-world attack activities.
The class covered the different phases of an attack: Reconnaissance, Network Scanning, Exploitation, Post-exploitation, Maintaining Access, and Covering Tracks. Several demonstrations showed how an entire attack would look across the phases, providing a more in-depth view of an activity that would require a higher skill level and extended time to complete fully. The students used Kali Linux, Metasploit, and the command line for tools such as Nmap across the phases to get hands-on experience. An overview of each hands-on activity was presented at the end of each topic discussion (e.g., Figure 1), followed by detailed steps.
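To make the Network Scanning phase concrete, the core idea behind a tool like Nmap's TCP connect scan can be sketched in a few lines. This Python snippet is an illustrative sketch only, not material from the class: it simply attempts a TCP connection to each candidate port and reports which ones accepted.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

A real scanner adds stealthier probe types, service fingerprinting, and parallelism, but the principle the students practiced is the same: an open port answers, a closed one refuses.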
Figure 1: Example of Student Hands-On Activity