Keys to Successful DoD Software Project Execution


Posted: July 13, 2017 | By: Joe Heil

KEY: Government In-House Applied Software Expertise

Many studies document that program offices frequently do not have the software experience, skills, training, or expertise required to execute successfully. Although software has evolved into one of the most significant, complex, and critical elements of DoD systems, a common acquisition approach is to treat the software components as “black boxes,” with the detailed understanding of the software (and ownership rights) left almost entirely in the hands of private industry. Government in-house software engineer participation (if any) is typically limited to the reactive (versus proactive) responsibility of reviewing industry-developed artifacts and supporting milestone reviews. This over-reliance on private industry can result in costly, non-modular, proprietary system architectures, protracted schedules, and poor performance. Sustainment of these systems is very expensive because the government is “locked into” the original industry software development organization and lacks the leverage (technical knowledge and ownership of the software) required to negotiate better cost and performance.

This over-reliance on industry has also reduced the ability to maintain an in-house government applied software expertise pipeline, leading to a dearth of program leadership that fully understands software development best practices. As documented in the 2008 memo from Secretary of the Navy Donald Winter: “This combination of personnel reductions and reduced RDT&E has seriously eroded the Department’s domain knowledge and produced an over-reliance on contractors to perform core in-house technical functions. This environment has led to outsourcing the ‘hands-on’ work that is needed in-house, to acquire the Nation’s best science and engineering talent and to equip them to meet the challenges of the future Navy. In order to acquire DoN Platforms and weapons systems in a responsible manner, it is imperative the DoN maintain applied technical domain expertise at all levels of the acquisition infrastructure.”

A proven alternative approach to software system acquisition, development, and sustainment teams government in-house software engineers with industry software engineers. Government software engineers do not just monitor and review industry software efforts; they are also responsible for the hands-on architecting, designing, coding, integrating, and testing of a subset of the mission-critical, complex software components. Government organic software experts are involved both in the original software development effort for the system (i.e., pre-Initial Operational Capability (IOC)) and throughout the software sustainment efforts (i.e., post-IOC capability upgrades, enhancements, and defect corrections). The percentage of software work allocated between government and industry software organizations will vary between programs based on multiple factors such as size, complexity, and system maturity. In the example programs listed below that utilize this approach, the percentage of government in-house versus industry software developers varies significantly.

This software development teaming approach has been used successfully for over 50 years by the Naval Warfare Centers for a wide range of systems (e.g., missiles, guns, directed energy, and lethal and non-lethal detect-track-engage systems) and for a wide range of development approaches (e.g., Waterfall, Incremental, Agile, and Rapid Prototyping). Specific programs include the Strategic Systems Submarine-Launched Ballistic Missile (SLBM) Fire Control System (FCS) and Mission Planning System (MPS), the Tactical Tomahawk Weapon Control System (TTWCS), the Precision Guided Munition (PGM) and Gun Battle Management System (BMS), Laser Weapon Systems components, and several ground vehicle detect-track-engage systems. These programs have all utilized government and industry software teaming and data-driven best practices to consistently deliver high-quality, safe, reliable, modular, scalable, maintainable, reusable, and operationally proven software systems developed within cost and schedule constraints.

By assigning actual software development responsibility to in-house engineers, the Government maintains a software expertise pipeline, as shown in the figure below, and thereby retains the applied, hands-on software expertise required to perform as peer-level technical teammates with private industry software engineers.

Maintaining the government in-house software expertise pipeline provides DoD Program Leaders with access to in-house software experts required to successfully:

  • Assess industry approaches, processes, and effort estimates.
  • Offer alternative technical approaches that are not profit-motivated.
  • Mitigate the risk of program office personnel turnover.
  • Apply lessons learned and metrics for continuous improvement.
  • Be assigned emergent tasks for technology investigation, rapid prototyping, or other technical tasks without costly contract modifications.
  • Control industry cost, since software development tasks can be readily transferred to the government team if industry cost growth becomes too great or technical performance is poor (and vice versa). This leverage works best when the government software development organization(s) have been involved from the initial system development effort and throughout the sustainment phases.

KEY: Applying Lessons Learned

The majority of challenges and best practices addressed in this paper have been previously reported in DoD software system acquisition and engineering assessment reports (e.g., from the Defense Science Board (DSB), the Government Accountability Office (GAO), and the Software Engineering Institute (SEI)). However, there are software-intensive system programs that continue to repeat the mistakes of the past.

Although the majority of programs conduct formal system engineering technical reviews (requirements reviews, design reviews, delivery readiness reviews, etc.), these programs do not collect project execution metrics or conduct periodic formal software process improvement events in which planned-versus-actual cost, schedule, technical performance, quality assurance, and risk metrics are analyzed and used to identify specific process improvement actions that are then assigned and tracked to closure.
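For illustration, the minimal sketch below shows one way such planned-versus-actual analysis could feed a periodic process improvement event. It is not drawn from any specific program; the metric names, sample values, and the 10% variance threshold are assumptions chosen only to make the example concrete.

```python
# Minimal, illustrative sketch (assumptions, not from any specific program):
# track planned-versus-actual execution metrics for an increment and flag
# large variances as process improvement actions to assign and track to closure.
from dataclasses import dataclass


@dataclass
class MetricSample:
    name: str       # e.g. "cost ($K)", "schedule (days)", "defects open at delivery"
    planned: float  # value planned at the start of the increment
    actual: float   # value measured at the end of the increment

    def variance_pct(self) -> float:
        """Percent deviation of actual from planned (positive = overrun)."""
        return 100.0 * (self.actual - self.planned) / self.planned


def flag_for_review(samples: list[MetricSample], threshold_pct: float = 10.0) -> list[str]:
    """Return candidate process improvement actions for metrics whose
    planned-versus-actual variance exceeds the (assumed) threshold."""
    actions = []
    for s in samples:
        v = s.variance_pct()
        if abs(v) > threshold_pct:
            actions.append(
                f"Investigate {s.name}: {v:+.1f}% vs plan; assign an owner and track to closure."
            )
    return actions


if __name__ == "__main__":
    # Hypothetical increment data for demonstration only.
    increment_metrics = [
        MetricSample("cost ($K)", planned=1200, actual=1450),
        MetricSample("schedule (days)", planned=90, actual=92),
        MetricSample("defects open at delivery", planned=5, actual=14),
    ]
    for action in flag_for_review(increment_metrics):
        print(action)
```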

Program offices lack leaders and staff with the applied software development experience, expertise, training, or awareness of the findings and recommendations from the many software assessment reports required to fully appreciate and adequately resource (with funding and schedule) best-practice-based software engineering and project control. The significant pressure to reduce cost and schedule drives program managers into short-term thinking and decision making, which in the long run frequently drives up total ownership cost, causes significant schedule delays and poor quality, and produces systems that are non-maintainable, non-scalable, and not architected for use across multiple systems and platforms.
