Secure boot

Software/firmware must verify a trust chain that extends through the hardware root of trust, boot loader, boot configuration file, and operating system image, in that order. The trusted boot/root-of-trust (RoT) computing module should be implemented on radiation-tolerant, burn-in (non-programmable) equipment.
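To make the ordering concrete, the sketch below (Python, illustrative only) walks the chain in the stated order, hardware root of trust, boot loader, boot configuration file, then operating system image, and refuses to proceed past any stage that fails verification. The stage names, the SHA-384 digest comparison used as a stand-in verifier, and the boot()/measure() helpers are assumptions for illustration; real boot firmware would verify digital signatures rooted in the hardware root-of-trust public key, as the requirements below describe.

```python
# Minimal sketch of an ordered secure-boot chain walk (illustrative only).
# The digest comparison stands in for the signature verification that real
# boot firmware would perform against the hardware root-of-trust public key.
import hashlib

# Expected SHA-384 digests, assumed to have been anchored by the root of trust.
EXPECTED = {
    "boot_loader":      hashlib.sha384(b"boot loader image").hexdigest(),
    "boot_config":      hashlib.sha384(b"boot configuration file").hexdigest(),
    "operating_system": hashlib.sha384(b"operating system image").hexdigest(),
}

# Verification proceeds in this order; later stages are never trusted
# before earlier ones have been verified.
BOOT_ORDER = ["boot_loader", "boot_config", "operating_system"]


def measure(stage: str, image: bytes) -> bool:
    """Stand-in verifier: compare the stage's SHA-384 digest to the anchored value."""
    return hashlib.sha384(image).hexdigest() == EXPECTED[stage]


def boot(images: dict) -> str:
    """Verify each stage in order; stop at the first failure instead of executing it."""
    for stage in BOOT_ORDER:
        if not measure(stage, images.get(stage, b"")):
            return f"recovery: {stage} failed verification"
    return "boot complete: full chain verified"


if __name__ == "__main__":
    good = {
        "boot_loader": b"boot loader image",
        "boot_config": b"boot configuration file",
        "operating_system": b"operating system image",
    }
    print(boot(good))                                   # boot complete: full chain verified
    print(boot({**good, "boot_config": b"tampered"}))   # recovery: boot_config failed verification
```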

Best Segment for Countermeasure Deployment

  • Space Segment

ID: CM0014
Created: 2022/10/19
Last Modified: 2022/12/08

Techniques Addressed by Countermeasure

ID Name Description
IA-0001 Compromise Supply Chain Threat actors may manipulate or compromise products or product delivery mechanisms before the customer receives them in order to achieve data or system compromise.
.02 Software Supply Chain Threat actors may manipulate software binaries and applications prior to the customer receiving them in order to achieve data or system compromise. This attack can take place in a number of ways, including manipulation of source code, manipulation of the update and/or distribution mechanism, or replacing compiled versions with malicious ones.
IA-0002 Compromise Software Defined Radio Threat actors may target software defined radios due to their software nature to establish C2 channels. Since SDRs are programmable, when combined with supply chain or development environment attacks, SDRs provide a pathway to set up covert C2 channels for a threat actor.
IA-0004 Secondary/Backup Communication Channel Threat actors may compromise alternative communication pathways which may not be as protected as the primary pathway. Depending on the implementation, the contingency communication pathways/solutions may lack the same level of security (i.e., physical security, encryption, authentication, etc.), which, if their use is forced, could give a threat actor an opportunity to launch attacks. Typically these attacks would have to be coupled with other denial-of-service techniques on the primary pathway to force usage of the secondary pathways.
.02 Receiver Threat actors may target the backup/secondary receiver on the space vehicle as a method to inject malicious communications into the mission. The secondary receivers may come from different supply chains than the primary, which could mean different levels of security and different weaknesses. As with the ground station, communication through the secondary receiver could be forced or occur naturally.
IA-0007 Compromise Ground System Threat actors may initially compromise the ground system in order to access the target spacecraft. Once compromised, the threat actor can perform a multitude of initial access techniques, including replay, compromising FSW deployment, compromising encryption keys, and compromising authentication schemes. Threat actors may also perform further reconnaissance within the system to enumerate mission networks and gather information related to ground station logical topology, missions run out of said ground station, birds that are in-band of targeted ground stations, and other mission system capabilities.
.01 Compromise On-Orbit Update Threat actors may manipulate and modify on-orbit updates before they are sent to the target spacecraft. This attack can be done in a number of ways, including manipulation of source code, manipulation of environment variables or on-board table/memory values, or replacing compiled versions with malicious ones.
EX-0004 Compromise Boot Memory Threat actors may manipulate boot memory in order to execute malicious code, bypass internal processes, or DoS the system. This technique can be used to perform other tactics such as Defense Evasion.
EX-0005 Exploit Hardware/Firmware Corruption Threat actors can target the underlying hardware and/or firmware using various TTPs that will be dependent on the specific hardware/firmware. Typically, software tools (e.g., antivirus, antimalware, intrusion detection) can protect a system from threat actors attempting to take advantage of those vulnerabilities to inject malicious code. However, there exist security gaps that cannot be closed by the above-mentioned software tools since they reside not in software applications, drivers, or the operating system but in the hardware itself. Hardware components, like memory modules and caches, can be exploited under specific circumstances, thus enabling backdoor access for potential threat actors. In addition to hardware, the firmware itself, which is often considered software in its own right, also provides an attack surface for threat actors. Firmware is programming written to a hardware device's non-volatile memory, where the content is retained when the device is turned off or loses its external power source. Firmware is written directly onto a piece of hardware during manufacturing and can be thought of as the software that enables the hardware to run. In the space vehicle context, field programmable gate array (FPGA)/application-specific integrated circuit (ASIC) logic/code is considered equivalent to firmware.
.01 Design Flaws Threat actors may target design features/flaws in the hardware design to their advantage to cause the desired impact. Threat actors may utilize the inherent design of the hardware (e.g., hardware timers, hardware interrupts, memory cells), which is intended to provide reliability, to their advantage to degrade other aspects like availability. Additionally, field programmable gate array (FPGA)/application-specific integrated circuit (ASIC) logic can be exploited just like software code can be exploited. There could be logic/design flaws embedded in the hardware (i.e., FPGA/ASIC) which may be exploitable by a threat actor.
PER-0001 Memory Compromise Threat actors may manipulate memory (boot, RAM, etc.) in order for their malicious code and/or commands to remain on the victim spacecraft. The spacecraft may have mechanisms that allow for the automatic running of programs on system reboot, entering or returning to/from safe mode, or during specific events. Threat actors may target these specific memory locations in order to store their malicious code or file, ensuring that the attack remains on the system even after a reset.
PER-0002 Backdoor Threat actors may find and target various backdoors, or inject their own, within the victim spacecraft in the hopes of maintaining their attack.
.02 Software Threat actors may inject code to create their own backdoor to establish persistent access to the spacecraft. This may be done through modification of code throughout the software supply chain or through modification of the software-defined radio configuration (if applicable).
DE-0007 Rootkit Rootkits are programs that hide the existence of malware by intercepting/hooking and modifying operating system API calls that supply system information. Rootkits or rootkit enabling functionality may reside at the flight software or kernel level in the operating system or lower, to include a hypervisor, Master Boot Record, or System Firmware.
DE-0008 Bootkit Adversaries may use bootkits to persist on systems and evade detection. Bootkits reside at a layer below the operating system and may make it difficult to perform full remediation unless an organization suspects one was used and can act accordingly.
EXF-0006 Modify Communications Configuration Threat actors can manipulate communications equipment, modifying the existing software, hardware, or the transponder configuration to exfiltrate data via unintentional channels the mission has no control over.
.01 Software Defined Radio Threat actors may target software defined radios due to their software nature to set up exfiltration channels. Since SDRs are programmable, when combined with supply chain or development environment attacks, SDRs provide a pathway to set up covert exfiltration channels for a threat actor.
.02 Transponder Threat actors may change the transponder configuration to exfiltrate data via radio access to an attacker-controlled asset.

Space Threats Addressed by Countermeasure

ID Description
SV-IT-2 Unauthorized modification or corruption of data  
SV-SP-9 On-orbit software updates/upgrades/patches/direct memory writes. If the TT&C link, the MOC, or even the developer's environment is compromised, the risk exists of a variation of a supply chain attack in which malicious code is injected after the spacecraft is in orbit.
SV-IT-3 Compromise boot memory  
SV-SP-11 Software defined radios - an SDR is another computer, networked to other parts of the spacecraft, that could be pivoted to by an attacker and infected with malicious code. Once access to an SDR is gained, the attacker could alter what the SDR thinks are the correct frequencies and settings for communicating with the ground.
SV-SP-7 Software can be broken down into three levels (the operating system and drivers layer, the data handling service layer, and the application layer). The highest impact on the system is likely from embedded code at the BIOS and kernel/firmware level, i.e., attacking the on-board operating system. Since the operating system manages all the programs and applications on the computer, it has a critical role in the overall security of the system. Since threats may occur deliberately or due to human error, malicious programs or persons, or existing system vulnerabilities, mitigations must be deployed to protect the OS.

Low-Level Requirements

Requirement Rationale/Additional Guidance/Notes
The [organization] shall implement a verifiable flaw remediation process into the developmental and operational configuration management process.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-5,SA-3,SA-3(1),SA-11,SI-3,SI-3(10)} The verifiable process should also include a cross reference to mission objectives and impact statements. Understanding the flaws discovered and how they correlate to mission objectives will aid in prioritization.
The [organization] shall verify that the scope of security testing/evaluation provides complete coverage of required security controls (to include abuse cases and penetration testing) at the depth of testing defined in the test documents.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,RA-5(3),SA-11(5),SA-11(7)} * The frequency of testing should be driven by Program completion events and updates. * Examples of approaches are static analyses, dynamic analyses, binary analysis, or a hybrid of the three approaches
The [organization] shall maintain evidence of the execution of the security assessment plan and the results of the security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11}
The [organization] shall create and implement a security assessment plan that includes: (1) The types of analyses, testing, evaluation, and reviews of all software and firmware components; (2) The degree of rigor to be applied to include abuse cases and/or penetration testing; and (3) The types of artifacts produced during those processes.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11,SA-11(5)} The security assessment plan should include evaluation of mission objectives in relation to the security of the mission. Assessments should not only be control based but also functional based to ensure mission is resilient against failures of controls.
The [organization] shall determine the vulnerabilities/weaknesses that require remediation, and coordinate the timeline for that remediation, in accordance with the analysis of the vulnerability scan report, the mission assessment of risk, and mission needs.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-5,CM-3,RA-5,RA-7,SI-3,SI-3(10)}
The [organization] shall employ dynamic analysis (e.g., using simulation, penetration testing, fuzzing, etc.) to identify software/firmware weaknesses and vulnerabilities in developed and incorporated code (open source, commercial, or third-party developed code).{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-8,CM-10(1),RA-3(1),SA-11(5),SA-11(8),SA-11(9),SI-3,SI-7(10)}
The [organization] shall perform penetration testing/analysis: (1) On potential system elements before accepting the system; (2) As a realistic simulation of the active adversary’s known adversary tactics, techniques, procedures (TTPs), and tools; and (3) Throughout the lifecycle on physical and logical systems, elements, and processes.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{CA-8(1),SA-9,SA-11(5),SR-5(2)} Penetration testing should be performed throughout the lifecycle on physical and logical systems, elements, and processes including: (1) Hardware, software, and firmware development processes; (2) Shipping/handling procedures; (3) Personnel and physical security programs; (4) Configuration management tools/measures to maintain provenance; and (5) Any other programs, processes, or procedures associated with the production/distribution of supply chain elements. 
The [organization] shall maintain a list of suppliers and potential suppliers used, and the products that they supply to include software.{SV-SP-3,SV-SP-4,SV-SP-11}{CM-10,PL-8(2),PM-30,SA-8(9),SA-8(11)} Ideally you have diversification with suppliers
The [organization] shall test software and firmware updates related to flaw remediation for effectiveness and potential side effects on mission systems in a separate test environment before installation.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-3,CM-3(1),CM-3(2),CM-4(1),CM-4(2),CM-10(1),SA-8(31),SA-11(9),SI-2,SI-3,SI-3(10),SI-7(10),SI-7(12),SR-5(2)} This requirement is focused on software and firmware flaws. If hardware flaw remediation is required, refine the requirement to make this clear. 
The [organization] shall define processes and procedures to be followed when integrity verification tools detect unauthorized changes to software, firmware, and information.{SV-IT-2}{CM-3,CM-3(1),CM-3(5),CM-5(6),CM-6,CP-2,IR-6,IR-6(2),PM-30,SC-16(1),SC-51,SI-3,SI-4(7),SI-4(24),SI-7,SI-7(7),SI-7(10)}
The [organization] shall release updated versions of the mission information systems incorporating security-relevant software and firmware updates, after suitable regression testing, at a frequency no greater than [Program-defined frequency [90 days]].{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-3(2),CM-4(1)} On-orbit patching/upgrades may be necessary if vulnerabilities are discovered after launch. The system should have the ability to update software post-launch.
The [organization] shall develop and implement anti-counterfeit policy and procedures designed to detect and prevent counterfeit components from entering the information system, including supporting tamper resistance and providing a level of protection against the introduction of malicious code or hardware.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{CM-3(8),CM-7(9),PM-30,SA-8(9),SA-8(11),SA-9,SA-10(3),SA-19,SC-51,SR-4(3),SR-4(4),SR-5(2),SR-11}
The [organization] shall define/maintain an approved operating system list for use on spacecraft.{SV-SP-7}{CM-7(5)} The operating system is extremely important to security and availability of the spacecraft, therefore should receive high levels of assurance that it operates as intended and free of critical weaknesses/vulnerabilities. 
The [organization] shall prohibit the use of binary or machine-executable code from sources with limited or no warranty and without the provision of source code.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-7(8)}
The [organization] shall report identified systems or system components containing software affected by recently announced cybersecurity-related software flaws (and potential vulnerabilities resulting from those flaws) to [organization] officials with cybersecurity responsibilities.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-11}{IR-6,IR-6(2),SI-2,SI-3,SI-4(12),SR-4(4)}
The [organization] shall use all-source intelligence analysis of suppliers and potential suppliers of the information system, system components, or system services to inform engineering, acquisition, and risk management decisions.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{PM-16,PM-30,RA-2,RA-3(1),RA-3(2),RA-7,SA-9,SA-12(8),SR-5(2)} * The Program should also consider sub suppliers and potential sub suppliers. * All-source intelligence of suppliers that the organization may use includes: (1) Defense Intelligence Agency (DIA) Threat Assessment Center (TAC), the enterprise focal point for supplier threat assessments for the DOD acquisition community risks; (2) Other U.S. Government resources including: (a) Government Industry Data Exchange Program (GIDEP) – Database where government and industry can record issues with suppliers, including counterfeits; and (b) System for Award Management (SAM) – Database of companies that are barred from doing business with the US Government. 
The [organization] shall request threat analysis of suppliers of critical components and manage access to and control of threat analysis products containing U.S. person information.{SV-SP-3,SV-SP-4,SV-SP-11}{PM-16,PM-30(1),RA-3(1),SA-9,SA-12,SR-1} The intent of this requirement is to address supply chain concerns on hardware and software vendors. Not required for trusted suppliers accredited to the Defense Microelectronic Activity (DMEA). If the Program intends to use a supplier not accredited by DMEA, the government customer should be notified as soon as possible. If the Program has internal processes to vet suppliers, it may meet this requirement. All software used and its origins must be included in the SBOM and be subjected to internal and Government vulnerability scans.
The [organization] shall protect against supply chain threats to the system, system components, or system services by employing security safeguards as defined by NIST SP 800-161 Rev. 1.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{PM-30,RA-3(1),SA-8(9),SA-8(11),SA-12,SI-3,SR-1} The chosen supply chain safeguards should demonstrably support a comprehensive, defense-in-breadth information security strategy. Safeguards should include protections for both hardware and software. The Program should define its critical components (HW & SW) and identify the supply chain protection approach/posture/process.
The [organization] shall use the threat and vulnerability analyses of the as-built system, system components, or system services to inform and direct subsequent testing/evaluation of the as-built system, component, or service.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-3(3),SA-11(2),SA-15(8),SI-3}
The [organization] shall ensure that the vulnerability scanning tools (e.g., static analysis and/or component analysis tools) used include the capability to readily update the list of potential information system vulnerabilities to be scanned.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(1),RA-5(3),SI-3}
The [organization] shall perform vulnerability analysis and risk assessment of all systems and software.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(3),SA-15(7),SI-3}
The [organization] shall ensure that vulnerability scanning tools and techniques are employed that facilitate interoperability among tools and automate parts of the vulnerability management process by using standards for: (1) Enumerating platforms, custom software flaws, and improper configurations; (2) Formatting checklists and test procedures; and (3) Measuring vulnerability impact.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(3),SI-3} Component/Origin scanning looks for open-source libraries/software that may be included into the baseline and looks for known vulnerabilities and open-source license violations.
The [organization] shall perform static binary analysis of all firmware that is utilized on the spacecraft.{SV-SP-7,SV-SP-11}{RA-5,SA-10,SA-11,SI-7(10)} Many commercial products/parts are utilized within the system and should be analyzed for security weaknesses. Blindly accepting that the firmware is free of weaknesses is unacceptable for high assurance missions. The intent is to not blindly accept firmware from unknown sources and assume it is secure. This is meant to apply to firmware the vendors are not developing internally. In-house developed firmware should go through the vendor's own testing program and have high assurance that it is secure. When utilizing firmware from other sources, "expecting" does not meet this requirement. Each supplier needs to provide evidence to support the claim that the firmware they deliver is genuine and secure.
The [organization] shall perform static source code analysis for all available source code looking for [[organization]-defined Top CWE List] weaknesses using a complementary set of static code analysis tools (i.e., more than one).{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,SA-11(1),SA-15(7)}
The [organization] shall analyze vulnerability/weakness scan reports and results from security control assessments.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,SI-3}
The [organization] shall ensure that the list of potential system vulnerabilities scanned is updated [prior to a new scan].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5(2),SI-3}
The [organization] shall perform configuration management of the system, component, or service during [design; development; implementation; operations].{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-10}
The [organization] shall review proposed changes to the spacecraft, assessing both mission and security impacts.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-10,CM-3(2)}
The [organization] shall correct flaws identified during security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11} Flaws that impact the mission objectives should be prioritized.
The [organization] shall perform [Selection (one or more): unit; integration; system; regression] testing/evaluation at [Program-defined depth and coverage].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11} The depth needs to include functional testing as well as negative/abuse testing.
The [organization] shall create a prioritized list of software weakness classes (e.g., Common Weakness Enumerations) to be used during static code analysis for prioritization of static analysis results.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(1),SA-15(7)} The prioritized list of CWEs should be created considering the operational environment, attack surface, etc. Results from the threat modeling and attack surface analysis should be used as inputs into the CWE prioritization process. There is also a CWSS (https://cwe.mitre.org/cwss/cwss_v1.0.1.html) process that can be used to prioritize CWEs. The prioritized list of CWEs can also help with tool selection, as tools can be chosen based on their ability to detect certain high-priority CWEs.
The [organization] shall use threat modeling and vulnerability analysis to inform the current development process using analysis from similar systems, components, or services where applicable.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(2),SA-15(8)}
The [organization] shall perform and document threat and vulnerability analyses of the as-built system, system components, or system services.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(2),SI-3}
The [organization] shall perform a manual code review of all flight code.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(4)}
The [organization] shall conduct an Attack Surface Analysis and reduce attack surfaces to a level that presents a low level of compromise by an attacker.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(6),SA-15(5)}
The [organization] shall define acceptable coding languages to be used by the software developer.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15}
The [organization] shall define acceptable secure coding standards for use by the software developers.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15}
The [organization] shall have automated means to evaluate adherence to coding standards.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15,SA-15(7),RA-5} Manual review cannot scale across the code base; you must have a way to scale in order to confirm your coding standards are being met. The intent is for automated means to ensure code adheres to a coding standard.
The [organization] shall perform component analysis (a.k.a. origin analysis) for developed or acquired software.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15(7),RA-5}
The [organization] shall require subcontractors developing information system components or providing information system services (as appropriate) to demonstrate the use of a system development life cycle that includes [state-of-the-practice system/security engineering methods, software development methods, testing/evaluation/validation techniques, and quality control processes].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-9}{SA-3,SA-4(3)} Select the particular subcontractors, software vendors, and manufacturers based on the criticality analysis performed for the Program Protection Plan and the criticality of the components that they supply. 
The [organization] shall require the developer of the system, system component, or system service to deliver the system, component, or service with [Program-defined security configurations] implemented.{SV-SP-1,SV-SP-9}{SA-4(5)} For the spacecraft FSW, the defined security configuration could include ensuring the software does not contain weaknesses from a pre-defined list of Common Weakness Enumerations (CWEs) and/or CAT I/II Application STIG findings.
The [organization] shall use a certified environment to develop, code and test executable software (firmware or bit-stream) that will be programmed into a one-time programmable FPGA or be programmed into non-volatile memory (NVRAM) that the FPGA executes.{SA-8(9),SA-8(11),SA-12,SA-12(1),SC-51,SI-7(10),SR-1,SR-5}
The [organization] shall correct reported cybersecurity-related information system flaws.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SI-2} * Although this requirement is stated to specifically apply to cybersecurity-related flaws, the Program office may choose to broaden it to all SV flaws. * This requirement is allocated to the Program, as it is presumed, they have the greatest knowledge of the components of the system and when identified flaws apply. 
The [organization] shall identify, report, and coordinate correction of cybersecurity-related information system flaws.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SI-2}
The [organization] shall develop and implement anti-counterfeit policy and procedures, in coordination with the [CIO], that is demonstrably consistent with the anti-counterfeit policy defined by the Program office.{SV-SP-4,SV-SP-11}{SR-11}
The [organization] shall employ [organization]-defined techniques to limit harm from potential adversaries identifying and targeting the Program supply chain.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-3(2),SC-38} Examples of security safeguards that the organization should consider implementing to limit the harm from potential adversaries targeting the organizational supply chain are: (1) Using trusted physical delivery mechanisms that do not permit access to the element during delivery (ship via a protected carrier, use cleared/official couriers, or a diplomatic pouch); (2) Using trusted electronic delivery of products and services (require downloading from approved, verification-enhanced sites); (3) Avoiding the purchase of custom configurations, where feasible; (4) Using procurement carve outs (i.e., exclusions to commitments or obligations), where feasible; (5) Using defensive design approaches; (6) Employing system OPSEC principles; (7) Employing a diverse set of suppliers; (8) Employing approved vendor lists with standing reputations in industry; (9) Using a centralized intermediary and “Blind Buy” approaches to acquire element(s) to hide actual usage locations from an untrustworthy supplier or adversary, and employing inventory management policies and processes; (10) Using flexible agreements during each acquisition and procurement phase so that it is possible to meet emerging needs or requirements to address supply chain risk without requiring complete revision or re-competition of an acquisition or procurement; (11) Using international, national, commercial or government standards to increase potential supply base; (12) Limiting the disclosure of information that can become publicly available; and (13) Minimizing the time between purchase decisions and required delivery.
The [organization] shall employ the [organization]-defined approaches for the purchase of the system, system components, or system services from suppliers.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-5} This could include tailored acquisition strategies, contract tools, and procurement methods.
The [organization] (and Prime Contractor) shall conduct a supplier review prior to entering into a contractual agreement with a contractor (or sub-contractor) to acquire systems, system components, or system services.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-6}
The [organization] shall employ [Selection (one or more): independent third-party analysis, Program penetration testing, independent third-party penetration testing] of [Program-defined supply chain elements, processes, and actors] associated with the system, system components, or system services.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-6(1)}
The [organization] shall employ [Program-defined Operations Security (OPSEC) safeguards] to protect supply chain-related information for the system, system components, or system services.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-7,SC-38,CP-2(8)} OPSEC safeguards may include: (1) Limiting the disclosure of information needed to design, develop, test, produce, deliver, and support the element for example, supplier identities, supplier processes, potential suppliers, security requirements, design specifications, testing and evaluation result, and system/component configurations, including the use of direct shipping, blind buys, etc.; (2) Extending supply chain awareness, education, and training for suppliers, intermediate users, and end users; (3) Extending the range of OPSEC tactics, techniques, and procedures to potential suppliers, contracted suppliers, or sub-prime contractor tier of suppliers; and (4) Using centralized support and maintenance services to minimize direct interactions between end users and original suppliers.
The [organization] shall enable integrity verification of hardware components.{SA-10(3),SA-8(21),SC-51} * The integrity verification mechanisms may include: ** Stipulating and monitoring logical delivery of products and services, requiring downloading from approved, verification-enhanced sites; ** Encrypting elements (software, software patches, etc.) and supply chain process data in transit (motion) and at rest throughout delivery; ** Requiring suppliers to provide their elements “secure by default”, so that additional configuration is required to make the element insecure; ** Implementing software designs using programming languages and tools that reduce the likelihood of weaknesses; ** Implementing cryptographic hash verification; and ** Establishing performance and sub-element baseline for the system and system elements to help detect unauthorized tampering/modification during repairs/refurbishing.
The [organization] shall enable integrity verification of software and firmware components.{SV-IT-2}{CM-3(5),CM-5(6),CM-10(1),SA-8(9),SA-8(11),SA-8(21),SA-10(1),SI-3,SI-4(24),SI-7,SI-7(10),SI-7(12),SR-4(4)} * The integrity verification mechanisms may include: ** Stipulating and monitoring logical delivery of products and services, requiring downloading from approved, verification-enhanced sites; ** Encrypting elements (software, software patches, etc.) and supply chain process data in transit (motion) and at rest throughout delivery; ** Requiring suppliers to provide their elements “secure by default”, so that additional configuration is required to make the element insecure; ** Implementing software designs using programming languages and tools that reduce the likelihood of weaknesses; ** Implementing cryptographic hash verification; and ** Establishing performance and sub-element baseline for the system and system elements to help detect unauthorized tampering/modification during repairs/refurbishing.
The [organization] shall require the developer of the system, system component, or system services to demonstrate the use of a system development life cycle that includes [state-of-the-practice system/security engineering methods, software development methods, testing/evaluation/validation techniques, and quality control processes].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-9}{SA-3,SA-4(3)} Examples of good security practices would be using defense-in-depth tactics across the board, least-privilege being implemented, two factor authentication everywhere possible, using DevSecOps, implementing and validating adherence to secure coding standards, performing static code analysis, component/origin analysis for open source, fuzzing/dynamic analysis with abuse cases, etc.
The [spacecraft] shall require multi-factor authorization for all spacecraft [applications or operating systems] updates within the spacecraft.{SV-SP-9,SV-SP-11}{AC-3(2),CM-3(8),CM-5,PM-12,SA-8(8),SA-8(31),SA-10(2),SI-3(8),SI-7(12),SI-10(6)} The intent is for multiple checks to be performed prior to executing these SV SW updates. One action is the mere act of uploading the SW to the spacecraft. Another action could be a check of a digital signature (ideal but not explicitly required), hash, CRC, or checksum; a minimal hash/CRC check sketch appears after this requirements list. Crypto boxes provide another level of authentication for all commands, including SW updates, but ideally there is another factor outside of crypto to protect against FSW updates. Multi-factor authorization could be the "two-man rule" where procedures are in place to prevent a successful attack by a single actor (note: development activities that are subsequently subject to review or verification activities may already require collaborating attackers such that a "two-man rule" is not appropriate).
The [spacecraft] shall prevent the installation of Flight Software without verification that the component has been digitally signed using a certificate that is recognized and approved by the ground.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-9}{CM-3,CM-3(8),CM-5,CM-5(3),CM-14,SA-8(8),SA-8(31),SA-10(2),SI-3,SI-7(12),SI-7(15)}
The [organization] shall employ automated tools that provide notification to ground operators upon discovering discrepancies during integrity verification.{CM-3(5),CM-6,IR-6,IR-6(2),SA-8(21),SC-51,SI-3,SI-4(7),SI-4(12),SI-4(24),SI-7(2)}
The [spacecraft] shall provide automatic notification to ground operators upon discovering discrepancies during integrity verification.{SV-IT-2}{CM-3(5),SA-8(21),SI-3,SI-4(7),SI-4(12),SI-4(24),SI-7(2)}
The [organization] shall ensure that software planned for reuse meets the fit, form, function, and security requirements as a component within the new application.{SV-SP-6,SV-SP-7,SV-SP-11}{CM-7(5)}
The [spacecraft] operating system, if COTS or FOSS, shall be selected from an [organization]-defined acceptance list.{SV-SP-7}{CM-7(8),CM-7(5)}
The [organization] shall define the security safeguards that are to be automatically employed when integrity violations are discovered.{SV-IT-2}{CP-2,SA-8(21),SI-3,SI-4(7),SI-4(12),SI-7(5),SI-7(8)}
The [spacecraft] shall retain the capability to update/upgrade operating systems while on-orbit.{SV-SP-7}{SA-4(5),SA-8(8),SA-8(31),SA-10(2),SI-3} The operating system updates should be performed using multi-factor authorization and should only be performed when risk of compromise/exploitation of identified vulnerability outweighs the risk of not performing the update.
The [spacecraft] boot firmware must validate the boot loader, boot configuration file, and operating system image, in that order, against their respective signatures.{SV-IT-3}{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10)} A signature is ~770 bits long. No requirement is imposed on the storage location of signatures.
The [spacecraft] boot firmware must verify a trust chain that extends through the hardware root of trust, boot loader, boot configuration file, and operating system image, in that order.{SV-IT-3}{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10)} These three items were chosen because they’re intended to be static values (once properly set up) but are in volatile storage. Also, the Boot ROM can’t be modified, so there’s no reason to check a signature.
The [spacecraft] trusted boot/RoT computing module shall be implemented on radiation tolerant burn-in (non-programmable) equipment.{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10)}
The [spacecraft] trusted boot/RoT shall be a separate compute engine controlling the trusted computing platform cryptographic processor.{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10)}
The [spacecraft] shall perform attestation at each stage of startup and ensure overall trusted boot regime (i.e., root of trust).{SV-IT-3}{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10),SI-7(17)} It is important for the computing module to be able to access a set of functions and commands that it trusts; that is, that it knows to be true. This concept is referred to as root of trust (RoT) and should be included in the spacecraft design. With RoT, a device can always be trusted to operate as expected. RoT functions, such as verifying the device’s own code and configuration, must be implemented in secure hardware (e.g., field programmable gate arrays). By checking the security of each stage of power-up, RoT devices form the first link in a chain of trust that protects the spacecraft.
The [organization] shall define and document the transitional state or security-relevant events when the spacecraft will perform integrity checks on software, firmware, and information.{SV-IT-2}{SA-8(21),SI-7(1),SI-7(10),SR-4(4)}
The [spacecraft] shall be capable of removing flight software after updated versions have been installed.{SV-SP-1,SV-SP-9}{SA-8(8),SI-2(6)}
The [spacecraft] shall provide the capability for data connection ports or input/output devices to be disabled or removed prior to spacecraft operations.{SV-AC-5}{SA-9(2),SC-7(14),SC-41,SC-51} Intent is for external physical data ports to be disabled (logical or physical) while in operational orbit. Port disablement does not necessarily need to be irreversible.
The [spacecraft] shall protect the confidentiality and integrity of [all information] using cryptography while it is at rest.{SV-IT-2,SV-CF-2}{SC-28,SC-28(1),SI-7(6)} * Information at rest refers to the state of information when it is located on storage devices as specific components of information systems. This is often referred to as data-at-rest encryption.
The [spacecraft] shall protect the confidentiality and integrity of all transmitted information.{SV-IT-2,SV-AC-7}{SC-8} * The intent as written is for all transmitted traffic to be protected. This includes internal to internal communications and especially outside of the boundary.
The [spacecraft] shall maintain the confidentiality and integrity of information during preparation for transmission and during reception.{SV-IT-2}{SC-8(2)} * Preparation for transmission and during reception includes the aggregation, packing, and transformation options performed prior to transmission and the undoing of those operations that occur upon receipt.
The [spacecraft] shall perform an integrity check of [Program-defined software, firmware, and information] at startup; at [Program-defined transitional states or security-relevant events].{SV-IT-2}{SI-7(1)}
The [organization] shall employ automated tools that provide notification to [Program-defined personnel] upon discovering discrepancies during integrity verification.{SV-IT-2}{SI-7(2)}
The [spacecraft] shall automatically [Selection (one or more): restart the FSW/processor; perform side swap; audit failure; implement Program-defined security safeguards] when integrity violations are discovered.{SV-IT-2}{SI-7(8)}
The [spacecraft] hardware root of trust must be an ECDSA NIST P-384 public key.{SV-IT-3}{SI-7(9)} No requirement is imposed on uniqueness.
The [spacecraft] hardware root of trust must be loadable only once, post-purchase.{SV-IT-3}{SI-7(9)} No requirement is imposed on preventing hardware readout. The public key belongs to the customer, not the manufacturer, so it must be loaded after purchase. Also, if it can be overwritten, there’s no reason to trust it.
The [spacecraft] shall implement trusted boot/RoT as a separate compute engine controlling the trusted computing platform cryptographic processor.{SV-IT-3}{SI-7(9)}
The [spacecraft] shall implement trusted boot/RoT computing module on radiation tolerant burn-in (non-programmable) equipment.{SV-IT-3}{SI-7(9)}
The [spacecraft] boot firmware must enter a recovery routine upon failing to verify signed data in the trust chain, and not execute or trust that signed data.{SV-IT-3}{SI-7(9),SI-7(10)} No other requirements are imposed on the recovery routine besides not using the failed data. Unverifiable data isn’t trusted and shouldn’t be run. 
The [spacecraft] root of trust must be an ECDSA NIST P-384 public key.{SI-7(9),SI-7(10)}
The [spacecraft] root of trust must be loadable only once, post-purchase.{SI-7(9),SI-7(10)}
The [spacecraft] secure boot mechanism shall be Commercial National Security Algorithm Suite (CNSA) compliant.{SV-IT-3}{SI-7(9),SI-7(10)} No certification process is required (or exists). The CNSA is easy to meet, only restricts algorithm choice, and aids ease-of-use for government customers.
The [spacecraft] shall allocate enough boot ROM memory for secure boot firmware execution.{SV-IT-3}{SI-7(9),SI-7(10)}
The [spacecraft] shall allocate enough SRAM memory for secure boot firmware execution.{SV-IT-3}{SI-7(9),SI-7(10)}
The [spacecraft] shall support the algorithmic construct of Elliptic Curve Digital Signature Algorithm (ECDSA) NIST P-384 + SHA-384 or equivalent strength.{SV-IT-3}{SI-7(9),SI-7(10)} Timing data may suggest cryptographic accelerators are unnecessary. This construct was chosen because (a) it’s in the CNSA suite and (b) it doesn’t require secret values to be stored.
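
As a concrete illustration of the construct named in the preceding requirement, the sketch below (Python with the third-party cryptography package) signs a boot-stage image with ECDSA over NIST P-384 and SHA-384 and shows a verifier that refuses to execute the image and falls back to a recovery path when verification fails, in the spirit of the recovery-routine requirement above. The in-memory key generation and function names are illustrative assumptions; on a real spacecraft the public key would be the customer-loaded hardware root of trust and the private key would never be on board.

```python
# Illustrative ECDSA P-384 + SHA-384 sign/verify round trip using the
# third-party "cryptography" package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec


def sign_image(private_key, image: bytes) -> bytes:
    """Ground-side: produce an ECDSA P-384 signature over the image using SHA-384."""
    return private_key.sign(image, ec.ECDSA(hashes.SHA384()))


def verify_or_recover(public_key, image: bytes, signature: bytes) -> str:
    """Boot-side: verify the signature; on failure, do not execute the image."""
    try:
        public_key.verify(signature, image, ec.ECDSA(hashes.SHA384()))
        return "verified: safe to hand off execution"
    except InvalidSignature:
        return "recovery: signature check failed, image not executed"


if __name__ == "__main__":
    root_key = ec.generate_private_key(ec.SECP384R1())  # stand-in for the customer's RoT key pair
    image = b"operating system image"
    signature = sign_image(root_key, image)
    print(verify_or_recover(root_key.public_key(), image, signature))
    print(verify_or_recover(root_key.public_key(), b"tampered image", signature))
```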
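
The multi-factor update-authorization requirement earlier in this list mentions a hash, CRC, or checksum as one check performed in addition to command authentication. A minimal sketch of that secondary check is below (Python, standard library only); the function name and the assumption that the expected SHA-256 digest and CRC-32 arrive over the authenticated command path are illustrative, not prescribed by the requirement.

```python
# Minimal sketch of a secondary integrity check on an uplinked software update.
# Assumes the expected SHA-256 digest and CRC-32 were delivered out of band,
# e.g., within the authenticated command stream; command authentication by the
# crypto unit remains a separate, independent factor.
import hashlib
import zlib


def update_checks_pass(package: bytes, expected_sha256: str, expected_crc32: int) -> bool:
    """Accept the update only if both checks match the expected values."""
    sha_ok = hashlib.sha256(package).hexdigest() == expected_sha256
    crc_ok = zlib.crc32(package) == expected_crc32
    return sha_ok and crc_ok


if __name__ == "__main__":
    package = b"flight software update, version 2"
    expected_sha = hashlib.sha256(package).hexdigest()
    expected_crc = zlib.crc32(package)
    print(update_checks_pass(package, expected_sha, expected_crc))             # True
    print(update_checks_pass(b"tampered update", expected_sha, expected_crc))  # False
```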