The file removal technique deletes malicious artifacts or programs from a computer system.
ID | Name |
---|---|
D3-ER | Email Removal |
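
As a rough illustration of what the file removal technique can look like on a ground or development system, the sketch below quarantines files whose SHA-256 digests match a known-bad list. The hash set, directory paths, and the quarantine-rather-than-delete choice are assumptions for this example; D3FEND does not prescribe a particular implementation.

```python
# Illustrative ground-side "file removal": quarantine files whose SHA-256 digests
# match a known-bad list. The hash set and paths are placeholders for this sketch.
import hashlib
import shutil
from pathlib import Path

KNOWN_BAD_SHA256 = {
    # Placeholder digest (SHA-256 of an empty file); real values would come from
    # an incident-response or threat-intelligence process.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def quarantine_malicious_artifacts(root: Path, quarantine: Path) -> list[Path]:
    """Move matching files out of service rather than deleting them outright."""
    quarantine.mkdir(parents=True, exist_ok=True)
    moved = []
    for path in root.rglob("*"):
        if path.is_file() and sha256_of(path) in KNOWN_BAD_SHA256:
            shutil.move(str(path), str(quarantine / path.name))
            moved.append(path)
    return moved

# Example: quarantine_malicious_artifacts(Path("/opt/gsw/uploads"), Path("/var/quarantine"))
```

Quarantining rather than irreversibly deleting preserves forensic evidence; on the spacecraft itself, the analogous action would likely be performed through commanded cleanup of on-board file systems or memory via the flight software.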
ID | Name | Description | NIST Rev5 | D3FEND | ISO 27001 |
---|---|---|---|---|---|
CM0017 | Coding Standard | Define acceptable coding standards to be used by the software developer. The mission should have automated means to evaluate adherence to coding standards. The coding standard should also identify the acceptable software development languages. The language choice should consider security requirements, the scalability and complexity of the application, the development budget and schedule, available resources, etc. The coding standard and language choice must ensure proper security constructs are in place. | PL-8 PL-8(1) SA-11 SA-15 SA-3 SA-4(9) SA-8 | D3-AI D3-AVE D3-SWI D3-DCE D3-EHPV D3-ORA D3-FEV D3-FR D3-ER D3-PE D3-PT D3-PS | A.5.8 A.5.2 A.5.8 A.8.25 A.8.31 A.8.27 A.8.28 A.8.29 A.8.30 A.5.8 A.8.25 |
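
One way to provide the "automated means to evaluate adherence to coding standards" called for by CM0017 is a lightweight scanner wired into continuous integration. The sketch below flags calls to functions that a C secure-coding standard commonly bans; the banned-function list and directory layout are illustrative assumptions, not a SPARTA-mandated set.

```python
# Illustrative CI gate: flag C source lines that call functions banned by the
# project's secure coding standard. The banned list here is an example only.
import re
import sys
from pathlib import Path

BANNED_CALLS = {"gets", "strcpy", "strcat", "sprintf"}  # example standard, not exhaustive
CALL_PATTERN = re.compile(r"\b(" + "|".join(sorted(BANNED_CALLS)) + r")\s*\(")

def scan(source_root: Path) -> int:
    """Print each violation and return the total count."""
    violations = 0
    for src in sorted(source_root.rglob("*.[ch]")):
        for lineno, line in enumerate(src.read_text(errors="replace").splitlines(), start=1):
            if CALL_PATTERN.search(line):
                violations += 1
                print(f"{src}:{lineno}: banned call: {line.strip()}")
    return violations

if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    sys.exit(1 if scan(root) else 0)
```

In practice such a gate would complement compiler warning flags and commercial or open-source static analyzers rather than replace them; the point is that adherence is machine-checked instead of relying on manual review alone.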
ID | Name | Description | |
---|---|---|---|
IA-0001 | Compromise Supply Chain | Threat actors may manipulate or compromise products or product delivery mechanisms before the customer receives them in order to achieve data or system compromise. | |
IA-0001.02 | Software Supply Chain | Threat actors may manipulate software binaries and applications prior to the customer receiving them in order to achieve data or system compromise. This attack can take place in a number of ways, including manipulation of source code, manipulation of the update and/or distribution mechanism, or replacing compiled versions with a malicious one. | |
IA-0007 | Compromise Ground System | Threat actors may initially compromise the ground system in order to access the target spacecraft. Once compromised, the threat actor can perform a multitude of initial access techniques, including replay, compromising FSW deployment, compromising encryption keys, and compromising authentication schemes. Threat actors may also perform further reconnaissance within the system to enumerate mission networks and gather information related to ground station logical topology, missions run out of said ground station, birds that are in-band of targeted ground stations, and other mission system capabilities. | |
IA-0007.01 | Compromise On-Orbit Update | Threat actors may manipulate and modify on-orbit updates before they are sent to the target spacecraft. This attack can be done in a number of ways, including manipulation of source code, environment variables, or on-board table/memory values, or replacement of compiled versions with a malicious one (a signature-verification sketch follows this table). | |
EX-0009 | Exploit Code Flaws | Threat actors may identify and exploit flaws or weaknesses within the software running on-board the target spacecraft. These attacks may be extremely targeted and tailored to specific coding errors introduced as a result of poor coding practices, or they may target known issues in the commercial software components. | |
EX-0009.01 | Flight Software | Threat actors may abuse known or unknown flight software code flaws in order to further the attack campaign. Some FSW suites contain API functionality for operator interaction. Threat actors may seek to exploit these or abuse a vulnerability/misconfiguration to maliciously execute code or commands. In some cases, these code flaws can perpetuate throughout the victim spacecraft, allowing access to otherwise segmented subsystems. | |
EX-0010 | Malicious Code | Threat actors may rely on other tactics and techniques in order to execute malicious code on the victim spacecraft. This can be done by compromising the supply chain or development environment in some capacity or by taking advantage of known commands. However, once malicious code has been uploaded to the victim spacecraft, the threat actor can then trigger the code to run via a specific command or wait for a legitimate user to trigger it accidentally. The code itself can do a number of different things to the hosted payload, subsystems, or underlying OS. | |
EX-0010.01 | Ransomware | Threat actors may encrypt spacecraft data to interrupt availability and usability. Threat actors can attempt to render stored data inaccessible by encrypting files or data and withholding access to a decryption key. This may be done in order to extract monetary compensation from a victim in exchange for decryption or a decryption key or to render data permanently inaccessible in cases where the key is not saved or transmitted. | |
EX-0010.02 | Wiper Malware | Threat actors may deploy wiper malware, which is a type of malicious software designed to destroy data or render it unusable. Wiper malware can spread through various means, including software vulnerabilities (CWE/CVE) or the exploitation of weak or stolen credentials. | |
EX-0010.03 | Rootkit | Rootkits are programs that hide the existence of malware by intercepting/hooking and modifying operating system API calls that supply system information. Rootkits or rootkit enabling functionality may reside at the flight software or kernel level in the operating system or lower, to include a hypervisor, Master Boot Record, or System Firmware. | |
EX-0010.04 | Bootkit | Adversaries may use bootkits to persist on systems and evade detection. Bootkits reside at a layer below the operating system and may make it difficult to perform full remediation unless an organization suspects one was used and can act accordingly. | |
PER-0002 | Backdoor | Threat actors may find and target various backdoors, or inject their own, within the victim spacecraft in the hopes of maintaining their attack. | |
PER-0002.02 | Software | Threat actors may inject code to create their own backdoor to establish persistent access to the spacecraft. This may be done through modification of code throughout the software supply chain or through modification of the software-defined radio configuration (if applicable). |
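
Several of the techniques above (IA-0001.02, IA-0007.01, EX-0010) depend on a tampered software image reaching the spacecraft, so a natural ground-side check is to verify each flight software image against a detached signature before uplink, as noted in IA-0007.01. The sketch below assumes a hypothetical Ed25519 signing scheme, placeholder file names, and the Python `cryptography` package; SPARTA does not mandate a particular signature algorithm or key-distribution approach.

```python
# Hypothetical ground-side check: refuse to uplink a flight software image unless its
# detached Ed25519 signature verifies against a key pinned from the development environment.
from pathlib import Path

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_fsw_image(image_path: Path, signature_path: Path, pinned_pubkey: bytes) -> bool:
    """Return True only if the image matches the signature made by the pinned key."""
    public_key = Ed25519PublicKey.from_public_bytes(pinned_pubkey)
    try:
        public_key.verify(signature_path.read_bytes(), image_path.read_bytes())
        return True
    except InvalidSignature:
        return False

# Example usage (paths and key bytes are placeholders):
# ok = verify_fsw_image(Path("fsw_v2.3.bin"), Path("fsw_v2.3.sig"), PINNED_KEY_32_BYTES)
# if not ok: abort the uplink and alert mission operations.
```

Ideally the spacecraft would repeat an equivalent verification on board before installing the update, so a compromised ground segment alone cannot push modified code.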
ID | Description |
---|---|
SV-AC-3 | Compromised master keys or any encryption key |
SV-CF-2 | Eavesdropping (RF and proximity) |
SV-IT-2 | Unauthorized modification or corruption of data |
SV-MA-2 | Heaters and flow valves of the propulsion subsystem are controlled by electric signals, so cyberattacks against these signals could cause propellant lines to freeze, lock valves, waste propellant, or even put the spacecraft into de-orbit or an unstable spin |
SV-AV-4 | Attacking the scheduling table to affect tasking |
SV-IT-5 | Onboard control procedures (i.e., ATS/RTS) that execute scripts/sets of commands |
SV-MA-3 | Attacks on critical software subsystems. The Attitude Determination and Control (AD&C) subsystem determines and controls the orientation of the satellite; any cyberattack that disrupts some portion of the control loop (sensor data, computation of control commands, and receipt of the commands) would impact operations. The Telemetry, Tracking and Commanding (TT&C) subsystem provides the interface between the satellite and the ground system; computations occur within the RF portion of the TT&C subsystem, presenting a cyberattack vector. The Command and Data Handling (C&DH) subsystem is the brains of the satellite; it interfaces with other subsystems, the payload, and the ground, and it receives, validates, decodes, and sends commands to other subsystems while receiving, processing, formatting, and routing data for both the ground and onboard computer. C&DH has the most cyber content and is likely the biggest target for cyberattack. The Electrical Power Subsystem (EPS) provides, stores, distributes, and controls power on the satellite; an attack on EPS could disrupt, damage, or destroy the satellite. |
SV-SP-1 | Exploitation of software vulnerabilities (bugs); unsecure code, logic errors, etc. in the FSW |
SV-SP-3 | Introduction of malicious software such as a virus, worm, Distributed Denial-Of-Service (DDOS) agent, keylogger, rootkit, or Trojan Horse |
SV-SP-6 | Software reuse, COTS dependence, and standardization of onboard systems using a building-block approach, with the addition of open-source technology, lead to supply chain threats |
SV-SP-9 | On-orbit software updates/upgrades/patches/direct memory writes. If the TT&C, the MOC, or even the developer's environment is compromised, the risk exists of a variation of a supply chain attack in which malicious code is injected after the spacecraft is in orbit |
SV-AC-5 | Proximity operations (i.e., grappling satellite) |
SV-AC-6 | The three main parts of the spacecraft are the CPU, memory, and I/O interfaces with parallel and/or serial ports. These are connected via buses (i.e., 1553) and need to be segregated. Threats include a supply chain attack on the CPU (FPGA/ASICs), a supply chain attack that gets malware burned into memory through the development process, and rogue RTs on the 1553 bus via hosted payloads. Additional concerns: security or fault management being disabled by a non-mission-critical or payload component; fault injection or MITM attacks on the 1553 bus (China has developed a fault injector for 1553, which could enable a hosted-payload attack if the payload has access to the main 1553 bus); and one piece of FSW affecting another, since things are not containerized from the OS or FSW perspective |
SV-AC-8 | Malicious use of hardware commands - backdoors / critical commands |
SV-AV-2 | Satellites base many operations on timing, especially since many operations are automated. A cyberattack that disrupts timing/timers could affect the vehicle (time jamming / time spoofing) |
SV-AV-3 | Affect the watchdog timer onboard the satellite, which could force the satellite into some sort of recovery mode/protocol |
SV-IT-3 | Compromise boot memory |
SV-IT-4 | Cause bit flips in memory via single event upsets |
SV-MA-8 | Payload (or other component) is told to constantly sense, emit, or run whatever mission it had, to the point that it drains the battery / operates in a loop at maximum power until the battery is depleted |
SV-SP-11 | Software-defined radios - an SDR is also another computer, networked to other parts of the spacecraft, that could be pivoted to by an attacker and infected with malicious code. Once access to an SDR is gained, the attacker could alter what the SDR thinks are the correct frequencies and settings for communicating with the ground |
SV-SP-7 | Software can be broken down into three levels (the operating system and drivers layer, the data handling service layer, and the application layer). The highest impact on the system is likely the embedded code at the BIOS or kernel/firmware level. Attacking the on-board operating system is significant because it manages all the programs and applications on the computer and therefore has a critical role in the overall security of the system. Since threats may occur deliberately or due to human error, malicious programs or persons, or existing system vulnerabilities, mitigations must be deployed to protect the OS |
SV-AV-5 | Using the fault management system against you. Understanding the fault response could be leveraged to get the satellite into a vulnerable state; for example, safe mode with crypto bypass, orbit correction maneuvers, affecting the integrity of TLM to cause action from the ground, or some sort of RPO to cause the spacecraft to go into safe mode |
SV-AV-6 | Complete compromise or corruption of running state |
SV-DCO-1 | Not knowing that you were attacked, or that an attack was attempted |
SV-MA-5 | Not being able to recover from a cyberattack |
SV-AC-1 | Attempting access to an access-controlled system resulting in unauthorized access |
SV-AC-2 | Replay of recorded authentic communications traffic at a later time with the hope that the authorized communications will provide data or some other system reaction (a sequence-counter countermeasure sketch appears after this table) |
SV-CF-1 | Tapping of communications links (wireline, RF, network) resulting in loss of confidentiality; traffic analysis to determine which entities are communicating with each other without being able to read the communicated information |
SV-CF-4 | Adversary monitors for safe-mode indicators so that they know when the satellite is in a weakened state and then launch an attack |
SV-IT-1 | Communications system spoofing resulting in denial of service and loss of availability and data integrity |
SV-AC-7 | Weak communication protocols - ones that do not include strong encryption |
SV-AV-1 | Communications system jamming resulting in denial of service and loss of availability and data integrity |
SV-MA-7 | Exploit the ground system and use it maliciously to interact with the spacecraft |
SV-AC-4 | Masquerading as an authorized entity in order to gain access / insider threat |
SV-AV-7 | The TT&C subsystem is the lead contributor to satellite failure over the first 10 years on-orbit, around 20% of the time. Failures due to gyros are around 12% between years one and six on-orbit, then ramp up starting around year six and overtake the TT&C subsystem's contribution to satellite failure. Need to ensure equipment is not counterfeit and the supply chain is sound |
SV-CF-3 | Knowledge of the target satellite's cyber-related design details would be crucial to inform a potential attacker, so the threat is the leaking of design data, which is often stored unclassified or on contractors' networks |
SV-MA-4 | Not knowing what your crown jewels are and how to protect them now and in the future |
SV-MA-6 | Not planning for security on the SV or designing in security from the beginning |
SV-SP-10 | Compromise of development environment source code (applicable to development environments not covered by threats SV-SP-1, SV-SP-3, and SV-SP-4) |
SV-SP-2 | Testing only focuses on functional requirements and rarely considers end-to-end or abuse cases |
SV-SP-4 | General supply chain interruption or manipulation |
SV-SP-5 | Hardware failure (i.e., tainted hardware) {ASIC and FPGA focused} |
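
Because SV-AC-2 describes replay of previously valid traffic, a small worked example helps show why a receive-side sequence counter defeats it: a replayed frame necessarily carries a counter value at or below the last accepted one. The frame layout and acceptance-window size below are assumptions for this sketch; comparable anti-replay counters exist in standards such as CCSDS Space Data Link Security.

```python
# Illustrative anti-replay check: accept a command frame only if its sequence counter
# is ahead of the last accepted value, within a bounded window. Field sizes and the
# window value are assumptions for the sketch, not a mission standard.
from dataclasses import dataclass

ACCEPTANCE_WINDOW = 256  # maximum forward jump tolerated (e.g., after dropped frames)

@dataclass
class CommandFrame:
    sequence_counter: int  # monotonically increasing, set by the authorized sender
    payload: bytes

class ReplayFilter:
    def __init__(self) -> None:
        self.last_accepted = -1

    def accept(self, frame: CommandFrame) -> bool:
        """Reject stale or implausibly large counters; advance state on success."""
        delta = frame.sequence_counter - self.last_accepted
        if delta <= 0 or delta > ACCEPTANCE_WINDOW:
            return False  # replayed, duplicated, or out-of-window frame
        self.last_accepted = frame.sequence_counter
        return True

# A recorded frame replayed later re-uses an old counter value and is dropped:
flt = ReplayFilter()
assert flt.accept(CommandFrame(1, b"ARM"))
assert not flt.accept(CommandFrame(1, b"ARM"))  # replay rejected
```

The counter only helps if frames are also authenticated; otherwise an attacker could simply forge a fresh counter value, which is why this check is normally paired with the authenticated encryption discussed at the end of the requirements table.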
Requirement | Rationale/Additional Guidance/Notes |
---|---|
The [organization] shall identify the applicable physical and environmental protection policies covering the development environment and spacecraft hardware. {PE-1,PE-14,SA-3,SA-3(1),SA-10(3)} | |
The [organization] shall develop and document program-specific identification and authentication policies for accessing the development environment and spacecraft. {AC-3,AC-14,IA-1,SA-3,SA-3(1)} | |
The [organization] shall protect documentation and Controlled Unclassified Information (CUI) as required, in accordance with the risk management strategy.{AC-3,CM-12,CP-2,PM-17,RA-5(4),SA-3,SA-3(1),SA-5,SA-10,SC-8(1),SC-28(3),SI-12} | |
The [organization] shall identify and properly classify mission sensitive design/operations information and access control shall be applied in accordance with classification guides and applicable federal laws, Executive Orders, directives, policies, regulations, and standards.{SV-CF-3,SV-AV-5}{AC-3,CM-12,CP-2,PM-17,RA-5(4),SA-3,SA-3(1),SA-5,SA-8(19),SC-8(1),SC-28(3),SI-12} | * Mission sensitive information should be classified as Controlled Unclassified Information (CUI), formerly known as Sensitive But Unclassified. Ideally these artifacts would be rated SECRET or higher and stored on classified networks. Mission sensitive information can typically include a wide range of candidate material: the functional and performance specifications, the RF ICDs, databases, scripts, simulation and rehearsal results/reports, descriptions of uplink protection including any disabling/bypass features, failure/anomaly resolution, and any other sensitive information related to architecture, software, and flight/ground/mission operations. This could all need protection at the appropriate level (e.g., unclassified, SBU, classified, etc.) to mitigate levels of cyber intrusions that may be conducted against the project’s networks. Stand-alone systems and/or separate database encryption may be needed with controlled access and on-going Configuration Management to ensure changes in command procedures and critical database areas are tracked, controlled, and fully tested to avoid loss of science or the entire mission. |
The [organization] shall ensure security requirements/configurations are applied to the development environments in accordance with NIST 800-171, with enhancements in 800-172, to prevent the compromise of source code from a supply chain or information leakage perspective.{AC-3,SA-3,SA-3(1),SA-15} | |
In coordination with [organization], the [organization] shall prioritize and remediate flaws identified during security testing/evaluation.{CA-2,CA-5,SA-11,SI-3,SI-3(10)} | |
The [organization] shall implement a verifiable flaw remediation process into the developmental and operational configuration management process.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-5,SA-3,SA-3(1),SA-11,SI-3,SI-3(10)} | The verifiable process should also include a cross reference to mission objectives and impact statements. Understanding the flaws discovered and how they correlate to mission objectives will aid in prioritization. |
The [organization] shall maintain evidence of the execution of the security assessment plan and the results of the security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11} | |
The [organization] shall create and implement a security assessment plan that includes: (1) The types of analyses, testing, evaluation, and reviews of all software and firmware components; (2) The degree of rigor to be applied to include abuse cases and/or penetration testing; and (3) The types of artifacts produced during those processes.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11,SA-11(5)} | The security assessment plan should include evaluation of mission objectives in relation to the security of the mission. Assessments should not only be control based but also functional based to ensure mission is resilient against failures of controls. |
The [organization] shall establish robust procedures and technical methods to perform testing, to include adversarial testing (i.e., abuse cases), of the platform hardware and software.{CA-8,CP-4(5),RA-5,RA-5(1),RA-5(2),SA-3,SA-4(3),SA-11,SA-11(1),SA-11(2),SA-11(5),SA-11(7),SA-11(8),SA-15(7)} | |
The [organization] shall define the secure communication protocols to be used within the mission in accordance with applicable federal laws, Executive Orders, directives, policies, regulations, and standards.{PL-7,RA-5(4),SA-4(9),SA-8(18),SA-8(19),SC-8(1),SC-16(3),SC-40(4),SI-12} | |
The [organization] shall perform static binary analysis of all firmware that is utilized on the spacecraft.{SV-SP-7,SV-SP-11}{RA-5,SA-10,SA-11,SI-7(10)} | Many commercial products/parts are utilized within the system and should be analyzed for security weaknesses; blindly accepting that the firmware is free of weaknesses is unacceptable for high assurance missions. The intent is to not blindly accept firmware from unknown sources and assume it is secure. This is meant to apply to firmware the vendors are not developing internally; in-house developed firmware should go through the vendor's own testing program and have high assurance it is secure. When utilizing firmware from other sources, "expecting" it to be secure does not meet this requirement; each supplier needs to provide evidence to support the claim that the firmware it delivers is genuine and secure. |
The [organization] shall correct flaws identified during security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11} | Flaws that impact the mission objectives should be prioritized. |
The [organization] shall perform [Selection (one or more): unit; integration; system; regression] testing/evaluation at [Program-defined depth and coverage].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11} | The depth needs to include functional testing as well as negative/abuse testing. |
The [organization] shall define acceptable coding languages to be used by the software developer.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15} | |
The [organization] shall define acceptable secure coding standards for use by the software developers.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15} | |
The [organization] shall have automated means to evaluate adherence to coding standards.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15,SA-15(7),RA-5} | Manual review cannot scale across the code base; you must have a way to scale in order to confirm your coding standards are being met. The intent is for automated means to ensure code adheres to a coding standard. |
The [organization] shall require subcontractors developing information system components or providing information system services (as appropriate) to demonstrate the use of a system development life cycle that includes [state-of-the-practice system/security engineering methods, software development methods, testing/evaluation/validation techniques, and quality control processes].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-9}{SA-3,SA-4(3)} | Select the particular subcontractors, software vendors, and manufacturers based on the criticality analysis performed for the Program Protection Plan and the criticality of the components that they supply. |
The [organization] shall define security requirements/configurations for development environments to prevent the compromise of source code from a supply chain or information leakage perspective.{SV-SP-10}{SA-15} | Source code should be classified as Controlled Unclassified Information (CUI), formerly known as Sensitive But Unclassified. Ideally source code would be rated SECRET or higher and stored on classified networks. NIST 800-171 is insufficient when protecting highly sensitive unclassified information, and more robust controls from NIST SP 800-53 and CNSSI 1253 should be employed. Greater scrutiny must be applied to all development environments. |
For FPGA pre-silicon artifacts that are developed, coded, and tested by a developer that is not accredited, the [organization] shall be subjected to a development environment and pre-silicon artifacts risk assessment by [organization]. Based on the results of the risk assessment, the [organization] may need to implement protective measures or other processes to ensure the integrity of the FPGA pre-silicon artifacts.{SV-SP-5}{SA-3,SA-3(1),SA-8(9),SA-8(11),SA-12,SA-12(1),SR-1,SR-5} | DOD-I-5200.44 requires the following: 4.c.2 "Control the quality, configuration, and security of software, firmware, hardware, and systems throughout their lifecycles... Employ protections that manage risk in the supply chain… (e.g., integrated circuits, field-programmable gate arrays (FPGA), printed circuit boards) when they are identifiable (to the supplier) as having a DOD end-use." 4.e "In applicable systems, integrated circuit-related products and services shall be procured from a Trusted supplier accredited by the Defense Microelectronics Activity (DMEA) when they are custom-designed, custom-manufactured, or tailored for a specific DOD military end use (generally referred to as application-specific integrated circuits (ASIC))." 1.g "In coordination with the DOD CIO, the Director, Defense Intelligence Agency (DIA), and the Heads of the DOD Components, develop a strategy for managing risk in the supply chain for integrated circuit-related products and services (e.g., FPGAs, printed circuit boards) that are identifiable to the supplier as specifically created or modified for DOD (e.g., military temperature range, radiation hardened)." |
The [organization] shall require the developer of the system, system component, or system services to demonstrate the use of a system development life cycle that includes [state-of-the-practice system/security engineering methods, software development methods, testing/evaluation/validation techniques, and quality control processes].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-9}{SA-3,SA-4(3)} | Examples of good security practices would be using defense-in-depth tactics across the board, least-privilege being implemented, two factor authentication everywhere possible, using DevSecOps, implementing and validating adherence to secure coding standards, performing static code analysis, component/origin analysis for open source, fuzzing/dynamic analysis with abuse cases, etc. |
The [organization] shall ensure that the allocated security safeguards operate in a coordinated and mutually reinforcing manner.{SV-MA-6}{CA-7(5),PL-7,PL-8(1),SA-8(19)} | |
The [organization] shall document and design a security architecture using a defense-in-depth approach that allocates the [organization]s defined safeguards to the indicated locations and layers: [Examples include: operating system abstractions and hardware mechanisms to the separate processors in the platform, internal components, and the FSW].{SV-MA-6}{CA-9,PL-7,PL-8,PL-8(1),SA-8(3),SA-8(4),SA-8(7),SA-8(9),SA-8(11),SA-8(13),SA-8(19),SA-8(29),SA-8(30)} | |
The [organization] shall implement a security architecture and design that provides the required security functionality, allocates security controls among physical and logical components, and integrates individual security functions, mechanisms, and processes together to provide required security capabilities and a unified approach to protection.{SV-MA-6}{PL-7,SA-2,SA-8,SA-8(1),SA-8(2),SA-8(3),SA-8(4),SA-8(5),SA-8(6),SA-8(7),SA-8(9),SA-8(11),SA-8(13),SA-8(19),SA-8(29),SA-8(30),SC-32,SC-32(1)} | |
The [organization] shall define acceptable secure communication protocols available for use within the mission in accordance with applicable federal laws, Executive Orders, directives, policies, regulations, and standards.{SV-AC-7}{SA-4(9)} | The secure communication protocol should include "strong" authenticated encryption characteristics. |
The [spacecraft] shall only use [organization]-defined communication protocols within the mission.{SV-AC-7}{SA-4(9)} | |
The [spacecraft] shall only use communication protocols that support encryption within the mission.{SA-4(9),SA-8(18),SA-8(19),SC-40(4)} |
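
The last three requirements hinge on communication protocols that provide "strong" authenticated encryption. The sketch below shows that property at the message level using AES-256-GCM from the Python `cryptography` package; the key handling, nonce strategy, and header contents are placeholders, and the choice of AES-GCM here is an illustrative assumption rather than a mandated mission protocol.

```python
# Illustrative authenticated encryption of a command frame with AES-256-GCM.
# Keys, nonce handling, and header contents are placeholders; a real link-layer
# security protocol (e.g., CCSDS SDLS) defines these fields and their management.
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def protect(key: bytes, header: bytes, command: bytes) -> tuple[bytes, bytes]:
    """Encrypt the command and bind the cleartext header as associated data."""
    nonce = os.urandom(12)  # never reuse a nonce with the same key
    ciphertext = AESGCM(key).encrypt(nonce, command, header)
    return nonce, ciphertext

def unprotect(key: bytes, header: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Raises cryptography.exceptions.InvalidTag if the frame was tampered with."""
    return AESGCM(key).decrypt(nonce, ciphertext, header)

key = AESGCM.generate_key(bit_length=256)
nonce, ct = protect(key, b"VC0|SEQ=42", b"SET_MODE SAFE")
assert unprotect(key, b"VC0|SEQ=42", nonce, ct) == b"SET_MODE SAFE"
```

The associated-data parameter binds the cleartext frame header to the ciphertext, so tampering with either the header or the encrypted command is detected on decryption even though the header itself is sent in the clear.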