With the evolution of modern technology, many organizations are embracing big data solutions to drive decision-making by uncovering patterns, trends, and correlations in massive amounts of raw data. At the same time, the adoption of big data analytics presents a multi-faceted challenge: managing voluminous data while mitigating data security risks.
Computer scientists and engineers across DoD laboratories are finding different means to collect, synthesize, process, and compare data in order to make the most of scientific observations. The capability of grid computing to connect large-scale computers and share resources is generating a surplus of unstructured data to analyze. Big data and high-performance computing (HPC) are hot-button subjects among academic, industrial, and government organizations, and scientists and engineers believe that HPC resources can significantly advance scientific research and discovery.
In 2015, the White House issued Executive Order 13702, creating the National Strategic Computing Initiative (NSCI). The NSCI was established to promote U.S. leadership in high-performance computing and to maximize the benefits of HPC for research, economic competitiveness, and scientific discovery. More recently, U.S. computing leaders, including Department of Energy laboratories, have partnered with government, universities, and the private sector to launch the COVID-19 High Performance Computing Consortium. The consortium allows researchers worldwide to access the world's most powerful HPC resources in support of COVID-19 research.
The primary objective of HPC systems is the most efficient execution of large-scale data analytics, which dictates lightweight security measures intended to reduce the overhead coupled with security requirements. Cybersecurity for HPC is a critical mission aspect that presents unique challenges in providing non-repudiation together with a high level of data protection and confidentiality for scientific observations. In this special report, we delineate methods for closing HPC security gaps by using the Berkeley Packet Filter (BPF) as part of a network load balancer. BPF was designed in the 1990s as an in-kernel virtual machine for efficient packet filtering. This report discusses how BPF is used for monitoring, debugging, and collecting statistics from the kernel. It is geared toward developers and users who want to understand HPC and the broader functionality of BPF as part of kernel runtime security, to assist with improving detection of security threats.