The [organization] shall produce a plan for the continuous monitoring of security control effectiveness.{SA-4(8),CP-4(5),PM-31}
|
|
The [organization] risk assessment shall include the full end-to-end communication pathway (i.e., round trip), to include any crosslink communications.{SV-MA-4}{AC-20,AC-20(1),AC-20(3),RA-3,SA-8(18)}
|
|
The [organization] shall identify and properly classify mission-sensitive design/operations information, and access control shall be applied in accordance with classification guides and applicable federal laws, Executive Orders, directives, policies, regulations, and standards.{SV-CF-3,SV-AV-5}{AC-3,CM-12,CP-2,PM-17,RA-5(4),SA-3,SA-3(1),SA-5,SA-8(19),SC-8(1),SC-28(3),SI-12}
|
* Mission-sensitive information should be classified at least as Controlled Unclassified Information (CUI), formerly known as Sensitive But Unclassified (SBU). Ideally these artifacts would be rated SECRET or higher and stored on classified networks. Mission-sensitive information typically includes a wide range of candidate material: functional and performance specifications, RF ICDs, databases, scripts, simulation and rehearsal results/reports, descriptions of uplink protection (including any disabling/bypass features), failure/anomaly resolution, and any other sensitive information related to architecture, software, and flight/ground/mission operations. All of this material may need protection at the appropriate level (e.g., unclassified, CUI, classified) to mitigate cyber intrusions that may be conducted against the project’s networks. Stand-alone systems and/or separate database encryption may be needed, with controlled access and ongoing configuration management, to ensure changes to command procedures and critical database areas are tracked, controlled, and fully tested to avoid loss of science or of the entire mission.
|
The [organization] shall protect the security plan from unauthorized disclosure and modification.{SV-MA-6}{AC-3,PL-2,PL-7}
|
|
In coordination with [organization], the [organization] shall prioritize and remediate flaws identified during security testing/evaluation.{CA-2,CA-5,SA-11,SI-3,SI-3(10)}
|
|
The [organization] shall implement a verifiable flaw remediation process into the developmental and operational configuration management process.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-5,SA-3,SA-3(1),SA-11,SI-3,SI-3(10)}
|
The verifiable process should also include a cross-reference to mission objectives and impact statements. Understanding the flaws discovered and how they correlate to mission objectives will aid in prioritization.
|
The [organization] shall verify that the scope of security testing/evaluation provides complete coverage of required security controls (to include abuse cases and penetration testing) at the depth of testing defined in the test documents.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,RA-5(3),SA-11(5),SA-11(7)}
|
* The frequency of testing should be driven by Program completion events and updates.
* Examples of approaches are static analysis, dynamic analysis, binary analysis, or a hybrid of the three.
|
The [organization] shall maintain evidence of the execution of the security assessment plan and the results of the security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11}
|
|
The [organization] shall create and implement a security assessment plan that includes: (1) The types of analyses, testing, evaluation, and reviews of all software and firmware components; (2) The degree of rigor to be applied to include abuse cases and/or penetration testing; and (3) The types of artifacts produced during those processes.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11,SA-11(5)}
|
The security assessment plan should include evaluation of mission objectives in relation to the security of the mission. Assessments should not only be control-based but also function-based, to ensure the mission is resilient against failures of controls.
|
The [organization] shall maintain an up-to-date Plan of Action and Milestones (POA&M) that identifies, assesses, prioritizes, and documents specific actions to be taken to correct deficiencies in the spacecraft's security posture.{CA-5}
|
|
The [organization] shall determine the vulnerabilities/weaknesses that require remediation, and coordinate the timeline for that remediation, in accordance with the analysis of the vulnerability scan report, the mission assessment of risk, and mission needs.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-5,CM-3,RA-5,RA-7,SI-3,SI-3(10)}
|
|
The [organization] shall implement, as part of an A&A process, a Continuous Monitoring Program (CMP) that evaluates the effectiveness of security control implementations on a recurring pre-defined basis.{CA-7,PM-31}
|
|
The [organization] shall coordinate penetration testing on mission critical spacecraft components (hardware and/or software).{SV-MA-4}{CA-8,CA-8(1),CP-4(5)}
|
Not all defects (e.g., buffer overflows, race conditions, and memory leaks) can be discovered statically; some require execution of the system. This is where space-centric cyber testbeds (i.e., cyber ranges) are imperative, as they provide a controlled environment in which components can be maliciously attacked to discover these undesirable conditions. Technology has improved to the point where digital twins of spacecraft are achievable, which provides an avenue for cyber testing that was often not performed due to perceived risk to the flight hardware.
|
The [organization] shall employ dynamic analysis (e.g., using simulation, penetration testing, fuzzing, etc.) to identify software/firmware weaknesses and vulnerabilities in developed and incorporated code (open source, commercial, or third-party developed code).{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-8,CM-10(1),RA-3(1),SA-11(5),SA-11(8),SA-11(9),SI-3,SI-7(10)}
|
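As an illustration only, the sketch below is a minimal hand-rolled fuzz harness in Python; `parse_command` is a hypothetical stand-in for the code under test, and a real program would more likely use coverage-guided fuzzing against the actual developed and incorporated code in a representative test environment.

```python
import random
import struct

def parse_command(frame: bytes) -> dict:
    """Hypothetical stand-in for the command/telemetry parser under test."""
    if len(frame) < 4:
        raise ValueError("frame too short")
    opcode, length = struct.unpack(">HH", frame[:4])
    payload = frame[4:4 + length]
    if len(payload) != length:
        raise ValueError("length field does not match payload")
    return {"opcode": opcode, "payload": payload}

def fuzz(iterations: int = 100_000, seed: int = 0) -> None:
    rng = random.Random(seed)
    for i in range(iterations):
        frame = bytes(rng.randrange(256) for _ in range(rng.randrange(64)))
        try:
            parse_command(frame)
        except ValueError:
            pass  # expected rejection of malformed input
        except Exception as exc:  # unexpected failure: record for triage
            print(f"iteration {i}: {type(exc).__name__}: {exc!r} input={frame.hex()}")

if __name__ == "__main__":
    fuzz()
```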
|
The [organization] shall perform penetration testing/analysis: (1) On potential system elements before accepting the system; (2) As a realistic simulation of known adversary tactics, techniques, and procedures (TTPs) and tools; and (3) Throughout the lifecycle on physical and logical systems, elements, and processes.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{CA-8(1),SA-9,SA-11(5),SR-5(2)}
|
Penetration testing should be performed throughout the lifecycle on physical and logical systems, elements, and processes including: (1) Hardware, software, and firmware development processes; (2) Shipping/handling procedures; (3) Personnel and physical security programs; (4) Configuration management tools/measures to maintain provenance; and (5) Any other programs, processes, or procedures associated with the production/distribution of supply chain elements.
|
The [organization] shall maintain a list of suppliers and potential suppliers used, and the products that they supply to include software.{SV-SP-3,SV-SP-4,SV-SP-11}{CM-10,PL-8(2),PM-30,SA-8(9),SA-8(11)}
|
Ideally, the supplier base is diversified to avoid dependence on any single supplier.
|
The [organization] shall distribute documentation to only personnel with defined roles and a need to know.{SV-CF-3,SV-AV-5}{CM-12,CP-2,SA-5,SA-10}
|
Least privilege and need-to-know should be employed in the protection of all documentation. Documentation can contain sensitive information that can aid in vulnerability discovery, detection, and exploitation. For example, command dictionaries for ground and space systems should be handled with extreme care. Additionally, mission design documents contain many key elements that, if compromised, could aid an attacker in successfully exploiting the system.
|
The [organization] shall test software and firmware updates related to flaw remediation for effectiveness and potential side effects on mission systems in a separate test environment before installation.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-3,CM-3(1),CM-3(2),CM-4(1),CM-4(2),CM-10(1),SA-8(31),SA-11(9),SI-2,SI-3,SI-3(10),SI-7(10),SI-7(12),SR-5(2)}
|
This requirement is focused on software and firmware flaws. If hardware flaw remediation is required, refine the requirement to make this clear.
|
The [organization] shall define processes and procedures to be followed when integrity verification tools detect unauthorized changes to software, firmware, and information.{SV-IT-2}{CM-3,CM-3(1),CM-3(5),CM-5(6),CM-6,CP-2,IR-6,IR-6(2),PM-30,SC-16(1),SC-51,SI-3,SI-4(7),SI-4(24),SI-7,SI-7(7),SI-7(10)}
|
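As a hedged illustration of the kind of check such tools perform, the Python sketch below builds a hash baseline for a directory tree and reports files whose hashes no longer match; the paths and baseline format are assumptions, and any detection would feed the defined processes and procedures referenced above.

```python
import hashlib
import json
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(root: Path, baseline_file: Path) -> None:
    """Record a hash for every file under root (run on a known-good build)."""
    baseline = {str(p.relative_to(root)): sha256(p) for p in root.rglob("*") if p.is_file()}
    baseline_file.write_text(json.dumps(baseline, indent=2))

def verify(root: Path, baseline_file: Path) -> list[str]:
    """Return files that are missing or whose hashes changed since the baseline."""
    baseline = json.loads(baseline_file.read_text())
    changed = []
    for rel, expected in baseline.items():
        p = root / rel
        if not p.is_file() or sha256(p) != expected:
            changed.append(rel)  # feed the defined response process
    return changed
```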
|
The [organization] shall release updated versions of the mission information systems incorporating security-relevant software and firmware updates, after suitable regression testing, at a frequency no greater than [Program-defined frequency [90 days]].{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-3(2),CM-4(1)}
|
On-orbit patching/upgrades may be necessary if vulnerabilities are discovered after launch. The system should have the ability to update software post-launch.
|
The [organization] shall develop and implement anti-counterfeit policy and procedures designed to detect and prevent counterfeit components from entering the information system, including support for tamper resistance and a level of protection against the introduction of malicious code or hardware.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{CM-3(8),CM-7(9),PM-30,SA-8(9),SA-8(11),SA-9,SA-10(3),SA-19,SC-51,SR-4(3),SR-4(4),SR-5(2),SR-11}
|
|
The [organization] shall prohibit the use of binary or machine-executable code from sources with limited or no warranty and without the provision of source code.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-7(8)}
|
|
The [organization] shall conduct a criticality analysis to identify mission critical functions and critical components and reduce the vulnerability of such functions and components through secure system design.{SV-SP-3,SV-SP-4,SV-AV-7,SV-MA-4}{CP-2,CP-2(8),PL-7,PM-11,PM-30(1),RA-3(1),RA-9,SA-8(9),SA-8(11),SA-8(25),SA-12,SA-14,SA-15(3),SC-7(29),SR-1}
|
During SCRM, criticality analysis will aid in determining supply chain risk. For mission critical functions/components, extra scrutiny must be applied to ensure the supply chain is secured.
|
The [organization] shall employ techniques to limit harm from potential adversaries identifying and targeting the [organization]'s supply chain.{CP-2,PM-30,SA-9,SA-12(5),SC-38,SR-3,SR-3(1),SR-3(2),SR-5(2)}
|
|
The [organization] shall employ Operations Security (OPSEC) safeguards to protect supply chain-related information for the system, system components, or system services. {CP-2(8),PM-30,SA-12(9),SC-38,SR-7}
|
|
The [organization] shall report counterfeit information system components to [organization] officials. {SV-SP-4}{IR-6,IR-6(2),PM-30,SA-19,SR-11}
|
|
The [organization] shall report identified systems or system components containing software affected by recently announced cybersecurity-related software flaws (and potential vulnerabilities resulting from those flaws) to [organization] officials with cybersecurity responsibilities.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-11}{IR-6,IR-6(2),SI-2,SI-3,SI-4(12),SR-4(4)}
|
|
The [organization] shall plan and coordinate security-related activities affecting the spacecraft with groups associated with systems from which the spacecraft is inheriting satisfaction of controls before conducting such activities in order to reduce the impact on other organizational entities.{SV-MA-6}{PL-2}
|
|
The [organization] shall develop a security plan for the spacecraft.{SV-MA-6}{PL-2,PL-7,PM-1,SA-8(29),SA-8(30)}
|
|
The [organization] shall use all-source intelligence analysis of suppliers and potential suppliers of the information system, system components, or system services to inform engineering, acquisition, and risk management decisions.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{PM-16,PM-30,RA-2,RA-3(1),RA-3(2),RA-7,SA-9,SA-12(8),SR-5(2)}
|
* The Program should also consider sub-suppliers and potential sub-suppliers.
* All-source intelligence of suppliers that the organization may use includes: (1) Defense Intelligence Agency (DIA) Threat Assessment Center (TAC), the enterprise focal point for supplier threat assessments for the DOD acquisition community; (2) Other U.S. Government resources, including: (a) Government Industry Data Exchange Program (GIDEP), a database where government and industry can record issues with suppliers, including counterfeits; and (b) System for Award Management (SAM), a database of companies that are barred from doing business with the US Government.
|
The [organization] shall request threat analysis of suppliers of critical components and manage access to and control of threat analysis products containing U.S. person information.{SV-SP-3,SV-SP-4,SV-SP-11}{PM-16,PM-30(1),RA-3(1),SA-9,SA-12,SR-1}
|
The intent of this requirement is to address supply chain concerns for hardware and software vendors. It is not required for trusted suppliers accredited by the Defense Microelectronics Activity (DMEA). If the Program intends to use a supplier not accredited by DMEA, the government customer should be notified as soon as possible. If the Program has internal processes to vet suppliers, those may meet this requirement. All software used, and its origins, must be included in the SBOM and be subjected to internal and Government vulnerability scans.
|
The [organization] shall use all-source intelligence analysis on threats to mission critical capabilities and/or system components to inform risk management decisions.{SV-MA-4}{PM-16,RA-3(2),RA-3(3),RA-7,RA-9,SA-12(8),SA-15(8)}
|
|
The [organization] shall conduct a supplier review prior to entering into a contractual agreement with a sub [organization] to acquire systems, system components, or system services.{PM-30,PM-30(1),RA-3(1),SA-8(9),SA-8(11),SA-9,SA-12(2),SR-5(2),SR-6}
|
|
The [organization] shall maintain documentation tracing the strategies, tools, and methods implemented to mitigate supply chain risk.{SV-SP-3,SV-SP-4,SV-AV-7}{PM-30,RA-3(1),SA-12(1),SR-5}
|
Examples include:
(1) Transferring a portion of the risk to the developer or supplier through the use of contract language and incentives;
(2) Using contract language that requires the implementation of SCRM throughout the system lifecycle in applicable contracts and other acquisition and assistance instruments (grants, cooperative agreements, Cooperative Research and Development Agreements (CRADAs), and other transactions). Within the DOD, examples include: (a) language outlined in the Defense Acquisition Guidebook section 13.13, Contracting; (b) language requiring the use of protected mechanisms to deliver elements and data about elements, processes, and delivery mechanisms; and (c) language that articulates that requirements flow down supply chain tiers to sub-prime suppliers;
(3) Incentives for suppliers that: (a) implement required security safeguards and SCRM best practices; (b) promote transparency into their organizational processes and security practices; (c) provide additional vetting of the processes and security practices of subordinate suppliers, critical information system components, and services; and (d) implement contractual measures to reduce supply chain risk down the contract stack;
(4) Gaining insight into supplier security practices;
(5) Using contract language and incentives to enable more robust risk management later in the lifecycle; and
(6) Using a centralized intermediary or “Blind Buy” approaches to acquire element(s) to hide actual usage locations from an untrustworthy supplier or adversary.
|
The [organization] shall protect against supply chain threats to the system, system components, or system services by employing security safeguards as defined by NIST SP 800-161 Rev. 1.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{PM-30,RA-3(1),SA-8(9),SA-8(11),SA-12,SI-3,SR-1}
|
The chosen supply chain safeguards should demonstrably support a comprehensive, defense-in-breadth information security strategy. Safeguards should include protections for both hardware and software. The Program should define its critical components (hardware and software) and identify the supply chain protection approach/posture/process.
|
The [organization] shall conduct an assessment of risk prior to each milestone review [SRR/PDR/CDR], including the likelihood and magnitude of harm, from the unauthorized access, use, disclosure, disruption, modification, or destruction of the platform and the information it processes, stores, or transmits.{SV-MA-4}{RA-2,RA-3,SA-8(25)}
|
|
The [organization] shall document risk assessment results in [risk assessment report].{SV-MA-4}{RA-3}
|
|
The [organization] shall review risk assessment results [At least annually if not otherwise defined in formal organizational policy].{SV-MA-4}{RA-3}
|
|
The [organization] shall update the risk assessment [At least annually if not otherwise defined in formal institutional policy] or whenever there are significant changes to the information system or environment of operation (including the identification of new threats and vulnerabilities), or other conditions that may impact the security state of the spacecraft.{SV-MA-4}{RA-3}
|
|
The [organization] shall document risk assessment results in a risk assessment report upon completion of each risk assessment.{RA-3,RA-7}
|
|
The [organization] shall use the threat and vulnerability analyses of the as-built system, system components, or system services to inform and direct subsequent testing/evaluation of the as-built system, component, or service.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-3(3),SA-11(2),SA-15(8),SI-3}
|
|
The [organization] shall ensure that the vulnerability scanning tools (e.g., static analysis and/or component analysis tools) used include the capability to readily update the list of potential information system vulnerabilities to be scanned.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(1),RA-5(3),SI-3}
|
|
The [organization] shall perform vulnerability analysis and risk assessment of all systems and software.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(3),SA-15(7),SI-3}
|
|
The [organization] shall ensure that vulnerability scanning tools and techniques are employed that facilitate interoperability among tools and automate parts of the vulnerability management process by using standards for: (1) Enumerating platforms, custom software flaws, and improper configurations; (2) Formatting checklists and test procedures; and (3) Measuring vulnerability impact.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(3),SI-3}
|
Component/origin scanning identifies open-source libraries/software that may be included in the baseline and checks them for known vulnerabilities and open-source license violations.
|
The [organization] shall perform static source code analysis for all available source code, looking for [[organization]-defined Top CWE List] weaknesses, using a complementary set of static code analysis tools (i.e., more than one).{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,SA-11(1),SA-15(7)}
|
|
The [organization] shall analyze vulnerability/weakness scan reports and results from security control assessments.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,SI-3}
|
|
The [organization] shall ensure that the list of potential system vulnerabilities scanned is updated [prior to a new scan].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5(2),SI-3}
|
|
The [organization] shall perform configuration management of the system, component, or service during [design; development; implementation; operations].{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-10}
|
|
The [organization] shall review proposed changes to the spacecraft, assessing both mission and security impacts.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-10,CM-3(2)}
|
|
The [organization] shall correct flaws identified during security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11}
|
Flaws that impact the mission objectives should be prioritized.
|
The [organization] shall perform [Selection (one or more): unit; integration; system; regression] testing/evaluation at [Program-defined depth and coverage].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11}
|
The depth needs to include functional testing as well as negative/abuse testing.
|
The [organization] shall create a prioritized list of software weakness classes (e.g., Common Weakness Enumerations) to be used during static code analysis for prioritization of static analysis results.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(1),SA-15(7)}
|
The prioritized list of CWEs should be created considering the operational environment, attack surface, etc. Results from threat modeling and attack surface analysis should be used as inputs to the CWE prioritization process. The Common Weakness Scoring System (CWSS) (https://cwe.mitre.org/cwss/cwss_v1.0.1.html) can also be used to prioritize CWEs. The prioritized list of CWEs can also inform tool selection, since tools can be chosen based on their ability to detect the high-priority CWEs.
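A minimal sketch of applying such a prioritized list to static-analysis output follows; the priority ranks and finding format are hypothetical, and a real ranking would come from threat modeling, attack surface analysis, and/or CWSS scoring.

```python
# Hypothetical priority ranks (1 = highest); a real list would be derived from
# threat modeling, attack surface analysis, and/or CWSS scoring.
CWE_PRIORITY = {
    "CWE-787": 1,  # out-of-bounds write
    "CWE-120": 2,  # classic buffer overflow
    "CWE-416": 3,  # use after free
    "CWE-190": 4,  # integer overflow or wraparound
}

def prioritize(findings: list[dict]) -> list[dict]:
    """Sort static-analysis findings so high-priority CWE classes surface first."""
    return sorted(findings, key=lambda f: CWE_PRIORITY.get(f["cwe"], len(CWE_PRIORITY) + 1))

findings = [
    {"file": "tlm.c", "line": 88, "cwe": "CWE-190"},
    {"file": "cmd.c", "line": 412, "cwe": "CWE-787"},
    {"file": "fsw.c", "line": 17, "cwe": "CWE-563"},  # unprioritized class sorts last
]
for finding in prioritize(findings):
    print(finding)
```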
|
The [organization] shall use threat modeling and vulnerability analysis to inform the current development process using analysis from similar systems, components, or services where applicable.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(2),SA-15(8)}
|
|
The [organization] shall perform and document threat and vulnerability analyses of the as-built system, system components, or system services.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(2),SI-3}
|
|
The [organization] shall perform a manual code review of all flight code.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(4)}
|
|
The [organization] shall conduct an Attack Surface Analysis and reduce attack surfaces to a level that presents a low risk of compromise by an attacker.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(6),SA-15(5)}
|
|
The [organization] shall define acceptable coding languages to be used by the software developer.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15}
|
|
The [organization] shall define acceptable secure coding standards for use by the software developers.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15}
|
|
The [organization] shall have automated means to evaluate adherence to coding standards.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15,SA-15(7),RA-5}
|
Manual review cannot scale across the code base; an automated means of confirming that coding standards are being met is required. The intent is for automated tooling to verify that code adheres to the defined coding standard.
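One possible automated check is sketched below: a small Python script that flags calls to banned C library functions across a source tree. The rule set is hypothetical; in practice the organization's chosen coding standard and its supporting analysis tools would define the rules and exceptions.

```python
import re
import sys
from pathlib import Path

# Hypothetical rule: ban a few unsafe C string functions. A real coding
# standard would define the full set of prohibited constructs and exceptions.
BANNED = re.compile(r"\b(strcpy|strcat|sprintf|gets)\s*\(")

def check(root: Path) -> int:
    violations = 0
    for src in root.rglob("*.c"):
        for lineno, line in enumerate(src.read_text(errors="replace").splitlines(), 1):
            if BANNED.search(line):
                violations += 1
                print(f"{src}:{lineno}: banned function call: {line.strip()}")
    return violations

if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    sys.exit(1 if check(root) else 0)  # nonzero exit fails the build
```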
|
The [organization] shall perform component analysis (a.k.a. origin analysis) for developed or acquired software.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15(7),RA-5}
|
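A minimal sketch of component/origin analysis follows; the SBOM entries, vulnerability feed, and license allow-list are placeholders, and real tooling would resolve components against current vulnerability and license databases.

```python
# Placeholder inputs: a minimal SBOM-style component list and a
# known-vulnerability feed keyed by (name, version).
SBOM = [
    {"name": "libfoo", "version": "1.2.3", "license": "BSD-3-Clause"},
    {"name": "libbar", "version": "0.9.0", "license": "GPL-3.0-only"},
]
KNOWN_VULNS = {
    ("libbar", "0.9.0"): ["EXAMPLE-VULN-0001"],  # placeholder identifier
}
APPROVED_LICENSES = {"BSD-3-Clause", "MIT", "Apache-2.0"}

def analyze(sbom, vulns, approved):
    """Flag components with known vulnerabilities or unapproved licenses."""
    for comp in sbom:
        key = (comp["name"], comp["version"])
        for vuln_id in vulns.get(key, []):
            print(f"{comp['name']} {comp['version']}: known vulnerability {vuln_id}")
        if comp["license"] not in approved:
            print(f"{comp['name']} {comp['version']}: license {comp['license']} not approved")

analyze(SBOM, KNOWN_VULNS, APPROVED_LICENSES)
```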
|
The [organization] shall document the spacecraft's security architecture, and how it is established within and is an integrated part of the Program's mission security architecture.{SV-MA-6}{SA-17}
|
|
The [organization] shall protect documentation and Essential Elements of Information (EEI) as required, in accordance with the risk management strategy.{SV-CF-3,SV-AV-5}{SA-5}
|
Essential Elements of Information (EEI):
|
The [organization] shall correct reported cybersecurity-related information system flaws.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SI-2}
|
* Although this requirement is stated to specifically apply to cybersecurity-related flaws, the Program office may choose to broaden it to all SV flaws.
* This requirement is allocated to the Program, as it is presumed they have the greatest knowledge of the components of the system and of when identified flaws apply.
|
The [organization] shall identify, report, and coordinate correction of cybersecurity-related information system flaws.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SI-2}
|
|
If using the Government Microelectronics Assessment for Trust (GOMAT) framework outright to perform ASIC and FPGA threat/vulnerability risk assessment, the following requirements would apply: {SV-SP-5}{SR-1,SR-5}
|
* 1.g: “In coordination with the DOD CIO, the Director, Defense Intelligence Agency (DIA), and the Heads of the DOD Components, develop a strategy for managing risk in the supply chain for integrated circuit-related products and services (e.g., FPGAs, printed circuit boards) that are identifiable to the supplier as specifically created or modified for DOD (e.g., military temperature range, radiation hardened).”
|
The [organization] shall develop and implement anti-counterfeit policy and procedures, in coordination with the [CIO], that is demonstrably consistent with the anti-counterfeit policy defined by the Program office.{SV-SP-4,SV-SP-11}{SR-11}
|
|
The [organization] shall employ [organization]-defined techniques to limit harm from potential adversaries identifying and targeting the Program supply chain.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-3(2),SC-38}
|
Examples of security safeguards that the organization should consider implementing to limit the harm from potential adversaries targeting the organizational supply chain include:
(1) Using trusted physical delivery mechanisms that do not permit access to the element during delivery (ship via a protected carrier, use cleared/official couriers, or a diplomatic pouch);
(2) Using trusted electronic delivery of products and services (require downloading from approved, verification-enhanced sites);
(3) Avoiding the purchase of custom configurations, where feasible;
(4) Using procurement carve-outs (i.e., exclusions to commitments or obligations), where feasible;
(5) Using defensive design approaches;
(6) Employing system OPSEC principles;
(7) Employing a diverse set of suppliers;
(8) Employing approved vendor lists with standing reputations in industry;
(9) Using a centralized intermediary and “Blind Buy” approaches to acquire element(s) to hide actual usage locations from an untrustworthy supplier or adversary;
(10) Employing inventory management policies and processes;
(11) Using flexible agreements during each acquisition and procurement phase so that it is possible to meet emerging needs or requirements to address supply chain risk without requiring complete revision or re-competition of an acquisition or procurement;
(12) Using international, national, commercial, or government standards to increase the potential supply base;
(13) Limiting the disclosure of information that can become publicly available; and
(14) Minimizing the time between purchase decisions and required delivery.
|
The [organization] shall employ the [organization]-defined approaches for the purchase of the system, system components, or system services from suppliers.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-5}
|
This could include tailored acquisition strategies, contract tools, and procurement methods.
|
The [organization] (and Prime Contractor) shall conduct a supplier review prior to entering into a contractual agreement with a contractor (or sub-contractor) to acquire systems, system components, or system services.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-6}
|
|
The [organization] shall employ [Selection (one or more): independent third-party analysis, Program penetration testing, independent third-party penetration testing] of [Program-defined supply chain elements, processes, and actors] associated with the system, system components, or system services.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-6(1)}
|
|
The [organization] shall employ [Program-defined Operations Security (OPSEC) safeguards] to protect supply chain-related information for the system, system components, or system services.{SV-SP-3,SV-SP-4,SV-AV-7,SV-SP-11}{SR-7,SC-38,CP-2(8)}
|
OPSEC safeguards may include: (1) Limiting the disclosure of information needed to design, develop, test, produce, deliver, and support the element (for example, supplier identities, supplier processes, potential suppliers, security requirements, design specifications, testing and evaluation results, and system/component configurations), including the use of direct shipping, blind buys, etc.; (2) Extending supply chain awareness, education, and training to suppliers, intermediate users, and end users; (3) Extending the range of OPSEC tactics, techniques, and procedures to potential suppliers, contracted suppliers, or the sub-prime contractor tier of suppliers; and (4) Using centralized support and maintenance services to minimize direct interactions between end users and original suppliers.
|
For FPGA pre-silicon artifacts that are developed, coded, and tested by a developer that is not accredited, the [organization] shall be subjected to a development environment and pre-silicon artifacts risk assessment by [organization]. Based on the results of the risk assessment, the [organization] may need to implement protective measures or other processes to ensure the integrity of the FPGA pre-silicon artifacts.{SV-SP-5}{SA-3,SA-3(1),SA-8(9),SA-8(11),SA-12,SA-12(1),SR-1,SR-5}
|
DOD-I-5200.44 requires the following:
* 4.c.2: “Control the quality, configuration, and security of software, firmware, hardware, and systems throughout their lifecycles... Employ protections that manage risk in the supply chain… (e.g., integrated circuits, field-programmable gate arrays (FPGA), printed circuit boards) when they are identifiable (to the supplier) as having a DOD end-use.”
* 4.e: “In applicable systems, integrated circuit-related products and services shall be procured from a Trusted supplier accredited by the Defense Microelectronics Activity (DMEA) when they are custom-designed, custom-manufactured, or tailored for a specific DOD military end use (generally referred to as application-specific integrated circuits (ASIC)).”
* 1.g: “In coordination with the DOD CIO, the Director, Defense Intelligence Agency (DIA), and the Heads of the DOD Components, develop a strategy for managing risk in the supply chain for integrated circuit-related products and services (e.g., FPGAs, printed circuit boards) that are identifiable to the supplier as specifically created or modified for DOD (e.g., military temperature range, radiation hardened).”
|
Any EEEE or mechanical piece parts that cannot be procured from the OCM or their authorized distribution network shall be approved and the government program office notified to prevent and detect counterfeit and fraudulent parts and materials.{SV-SP-5}{SA-8(9),SA-8(11),SA-12,SA-12(1),SR-1,SR-5}
|
The Program, working with the contractors, shall identify which ASICs/FPGAs perform or execute an integral part of mission critical functions and whether the supplier is accredited “Trusted” by DMEA. If the contractor is not accredited by DMEA, then the Program may apply some of the ASIC/FPGA assurance requirements below to the contractor, and the Program may need to perform a risk assessment of the contractor’s design environment.
|
For ASICs that are designed, developed, manufactured, packaged, or tested by a supplier that is not DMEA accredited, the ASIC development shall undergo a threat/vulnerability risk assessment. Based on the results of the risk assessment, the [organization] may need to implement protective measures or other processes to ensure the integrity of the ASIC.{SV-SP-5}{SA-8(9),SA-8(11),SA-8(21),SA-12,SA-12(1),SR-1,SR-4(4),SR-5}
|
DOD-I-5200.44 requires the following:
* 4.c.2: “Control the quality, configuration, and security of software, firmware, hardware, and systems throughout their lifecycles... Employ protections that manage risk in the supply chain… (e.g., integrated circuits, field-programmable gate arrays (FPGA), printed circuit boards) when they are identifiable (to the supplier) as having a DOD end-use.”
* 4.e: “In applicable systems, integrated circuit-related products and services shall be procured from a Trusted supplier accredited by the Defense Microelectronics Activity (DMEA) when they are custom-designed, custom-manufactured, or tailored for a specific DOD military end use (generally referred to as application-specific integrated circuits (ASIC)).”
* 1.g: “In coordination with the DOD CIO, the Director, Defense Intelligence Agency (DIA), and the Heads of the DOD Components, develop a strategy for managing risk in the supply chain for integrated circuit-related products and services (e.g., FPGAs, printed circuit boards) that are identifiable to the supplier as specifically created or modified for DOD (e.g., military temperature range, radiation hardened).”
|
Any EEEE or mechanical piece parts that cannot be procured from the OCM or their authorized franchised distribution network shall be approved by the [organization]’s Parts, Materials and Processes Control Board (PMPCB) as well as the government program office to prevent and detect counterfeit and fraudulent parts and materials.{SV-SP-5}{SR-1,SR-5}
|
The Program, working with the contractors, shall identify which ASICs/FPGAs perform or execute an integral part of mission critical functions and whether the supplier is accredited “Trusted” by DMEA. If the contractor is not accredited by DMEA, then the Program may apply some of the ASIC/FPGA assurance requirements below to the contractor, and the Program may need to perform a risk assessment of the contractor’s design environment.
|
For ASICs that are designed, developed, manufactured, packaged, or tested by a supplier that is NOT DMEA accredited Trusted, the ASIC development shall undergo a threat/vulnerability risk assessment. The assessment shall use Aerospace security guidance and requirements tailored from TOR-2019-00506 Vol. 2, and TOR-2019-02543 ASIC and FPGA Risk Assessment Process and Checklist. Based on the results of the risk assessment, the Program may require the developer to implement protective measures or other processes to ensure the integrity of the ASIC.{SV-SP-5}{SR-1,SR-5}
|
DOD-I-5200.44 requires the following:
* 4.c.2: “Control the quality, configuration, and security of software, firmware, hardware, and systems throughout their lifecycles... Employ protections that manage risk in the supply chain… (e.g., integrated circuits, field-programmable gate arrays (FPGA), printed circuit boards) when they are identifiable (to the supplier) as having a DOD end-use.”
* 4.e: “In applicable systems, integrated circuit-related products and services shall be procured from a Trusted supplier accredited by the Defense Microelectronics Activity (DMEA) when they are custom-designed, custom-manufactured, or tailored for a specific DOD military end use (generally referred to as application-specific integrated circuits (ASIC)).”
* 1.g: “In coordination with the DOD CIO, the Director, Defense Intelligence Agency (DIA), and the Heads of the DOD Components, develop a strategy for managing risk in the supply chain for integrated circuit-related products and services (e.g., FPGAs, printed circuit boards) that are identifiable to the supplier as specifically created or modified for DOD (e.g., military temperature range, radiation hardened).”
|
For FPGA pre-silicon artifacts that are developed, coded, and tested by a developer that is NOT DMEA accredited Trusted, the contractor/developer shall be subjected to a development environment and pre-silicon artifacts risk assessment by the Program. The assessment shall use Aerospace security guidance and requirements in TOR-2019-00506 Vol. 2, and TOR-2019-02543 ASIC and FPGA Risk Assessment Process and Checklist. Based on the results of the risk assessment, the Program may require the developer to implement protective measures or other processes to ensure the integrity of the FPGA pre-silicon artifacts.{SV-SP-5}{SR-1,SR-5}
|
DOD-I-5200.44 requires the following:
* 4.c.2: “Control the quality, configuration, and security of software, firmware, hardware, and systems throughout their lifecycles... Employ protections that manage risk in the supply chain… (e.g., integrated circuits, field-programmable gate arrays (FPGA), printed circuit boards) when they are identifiable (to the supplier) as having a DOD end-use.”
* 4.e: “In applicable systems, integrated circuit-related products and services shall be procured from a Trusted supplier accredited by the Defense Microelectronics Activity (DMEA) when they are custom-designed, custom-manufactured, or tailored for a specific DOD military end use (generally referred to as application-specific integrated circuits (ASIC)).”
* 1.g: “In coordination with the DOD CIO, the Director, Defense Intelligence Agency (DIA), and the Heads of the DOD Components, develop a strategy for managing risk in the supply chain for integrated circuit-related products and services (e.g., FPGAs, printed circuit boards) that are identifiable to the supplier as specifically created or modified for DOD (e.g., military temperature range, radiation hardened).”
|
The [organization] shall ensure that the contractors/developers have all ASICs designed, developed, manufactured, packaged, and tested by suppliers with a Defense Microelectronics Activity (DMEA) Trust accreditation.{SV-SP-5}{SR-1,SR-5}
|
|
The [organization] shall ensure that the contractors/developers have all EEEE and mechanical piece parts procured from the Original Component Manufacturer (OCM) or their authorized franchised distribution network.{SV-SP-5}{SR-1,SR-5}
|
These requirements might only make sense for ASICs/FPGAs that are deemed to support mission critical functions. The Program has the responsibility to identify all ASICs and FPGAs that are used in all flight hardware by each hardware element. This list must include all contractor and subcontractor usage of ASICs and FPGAs.
|
The [organization] shall use a DMEA certified environment to develop, code and test executable software (firmware or bit-stream) that will be programmed into a one-time programmable FPGA or be programmed into non-volatile memory (NVRAM) that the FPGA executes.{SV-SP-5}{SR-1,SR-5}
|
DOD-I-5200.44 requires the following:
* 4.c.2: “Control the quality, configuration, and security of software, firmware, hardware, and systems throughout their lifecycles... Employ protections that manage risk in the supply chain… (e.g., integrated circuits, field-programmable gate arrays (FPGA), printed circuit boards) when they are identifiable (to the supplier) as having a DOD end-use.”
* 4.e: “In applicable systems, integrated circuit-related products and services shall be procured from a Trusted supplier accredited by the Defense Microelectronics Activity (DMEA) when they are custom-designed, custom-manufactured, or tailored for a specific DOD military end use (generally referred to as application-specific integrated circuits (ASIC)).”
* 1.g: “In coordination with the DOD CIO, the Director, Defense Intelligence Agency (DIA), and the Heads of the DOD Components, develop a strategy for managing risk in the supply chain for integrated circuit-related products and services (e.g., FPGAs, printed circuit boards) that are identifiable to the supplier as specifically created or modified for DOD (e.g., military temperature range, radiation hardened).”
|
The [organization] should have requirements/controls for all ground/terrestrial systems covering: Data Protection, Ground Software, Endpoints, Networks, Computer Network Defense / Incident Response, Perimeter Security, Physical Controls, and Prevention Program (SSP, PPP, and Training). See NIST 800-53 and CNSSI 1253 for guidance on ground security.{SV-MA-7}
|
|
The [organization] shall ensure reused TT&C software has adequate uniqueness for command decoders/dictionaries so that commands are received by only the intended satellite.{SV-SP-6}{AC-17(10),SC-16(3),SI-3(9)}
|
The goal is to eliminate the risk that compromise of one command database affects another due to reuse. The intent is to ensure that one SV cannot process the commands of another SV. Given that the crypto setup requires keys and VCC to match, this requirement may be inherently met as a result of using Type-1 cryptography. The intent is not to recreate entire command dictionaries but to have enough uniqueness in place to prevent an SV from receiving a rogue command. As long as there is some uniqueness at the receiving end of the commands, that is adequate.
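As a hedged sketch of the intent (not a prescribed design), the Python fragment below shows a receive-side check in which a hypothetical vehicle-unique identifier must match before an opcode is looked up in the per-vehicle command dictionary, so a frame built from another vehicle's dictionary is simply dropped.

```python
# Hypothetical frame layout: byte 0 = spacecraft ID, byte 1 = opcode.
# The only point being illustrated is that some vehicle-unique value is
# checked at the receiving end before any command is dispatched.
MY_SPACECRAFT_ID = 0x2A
COMMAND_DICTIONARY = {0x10: "NOOP", 0x11: "RESET_WATCHDOG"}  # per-vehicle dictionary

def accept(frame: bytes):
    if len(frame) < 2 or frame[0] != MY_SPACECRAFT_ID:
        return None  # addressed to a different vehicle (or malformed): drop
    return COMMAND_DICTIONARY.get(frame[1])  # unknown opcode is also rejected

print(accept(bytes([0x2A, 0x10])))  # 'NOOP'
print(accept(bytes([0x2B, 0x10])))  # None: another vehicle's frame
```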
|
The [organization] shall ensure that the allocated security safeguards operate in a coordinated and mutually reinforcing manner.{SV-MA-6}{CA-7(5),PL-7,PL-8(1),SA-8(19)}
|
|
The [organization] shall document and design a security architecture using a defense-in-depth approach that allocates the [organization]'s defined safeguards to the indicated locations and layers: [Examples include: operating system abstractions and hardware mechanisms to the separate processors in the platform, internal components, and the FSW].{SV-MA-6}{CA-9,PL-7,PL-8,PL-8(1),SA-8(3),SA-8(4),SA-8(7),SA-8(9),SA-8(11),SA-8(13),SA-8(19),SA-8(29),SA-8(30)}
|
|
The [spacecraft] shall prevent the installation of Flight Software without verification that the component has been digitally signed using a certificate that is recognized and approved by the ground.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-9}{CM-3,CM-3(8),CM-5,CM-5(3),CM-14,SA-8(8),SA-8(31),SA-10(2),SI-3,SI-7(12),SI-7(15)}
|
|
The [organization] shall ensure that software planned for reuse meets fit, form, function, and security requirements as a component within the new application.{SV-SP-6,SV-SP-7,SV-SP-11}{CM-7(5)}
|
|
The [organization] shall implement a security architecture and design that provides the required security functionality, allocates security controls among physical and logical components, and integrates individual security functions, mechanisms, and processes together to provide required security capabilities and a unified approach to protection.{SV-MA-6}{PL-7,SA-2,SA-8,SA-8(1),SA-8(2),SA-8(3),SA-8(4),SA-8(5),SA-8(6),SA-8(7),SA-8(9),SA-8(11),SA-8(13),SA-8(19),SA-8(29),SA-8(30),SC-32,SC-32(1)}
|
|
The [spacecraft] shall have an on-board intrusion detection/prevention system that monitors the mission critical components or systems.{SV-AC-1,SV-AC-2,SV-MA-4}{RA-10,SC-7,SI-3,SI-3(8),SI-4,SI-4(1),SI-4(7),SI-4(13),SI-4(24),SI-4(25),SI-10(6)}
|
The mission critical components or systems could include GNC/Attitude Control, C&DH, TT&C, and Fault Management.
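A minimal, hypothetical sketch of rule-based on-board monitoring follows; the telemetry field names and thresholds are assumptions, and a flight implementation would run on-board, cover the mission critical subsystems named above, and feed fault management.

```python
# Hypothetical monitoring rules over assumed telemetry fields.
RULES = {
    "cmd_reject_count": lambda v: v > 5,       # repeated malformed/unauthorized commands
    "unauth_bus_writes": lambda v: v > 0,      # any unauthorized bus write
    "fsw_image_crc_ok": lambda v: v is False,  # flight software integrity check failed
}

def evaluate(telemetry: dict) -> list[str]:
    """Return the names of rules that tripped for this telemetry sample."""
    return [name for name, tripped in RULES.items()
            if name in telemetry and tripped(telemetry[name])]

alerts = evaluate({"cmd_reject_count": 7, "unauth_bus_writes": 0, "fsw_image_crc_ok": True})
print(alerts)  # ['cmd_reject_count'] -> report to fault management / ground
```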
|
The [spacecraft] boot firmware must validate the boot loader, boot configuration file, and operating system image, in that order, against their respective signatures.{SV-IT-3}{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10)}
|
A signature is ~770 bits long. No requirement is imposed on the storage location of signatures.
|
The [spacecraft] boot firmware must verify a trust chain that extends through the hardware root of trust, boot loader, boot configuration file, and operating system image, in that order.{SV-IT-3}{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10)}
|
These three items were chosen because they’re intended to be static values (once properly set up) but are in volatile storage. Also, the Boot ROM can’t be modified, so there’s no reason to check a signature.
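A minimal sketch of the verification order is shown below, assuming the Python `cryptography` package, a P-384 root-of-trust public key, and placeholder image/signature blobs; the recovery and hand-off hooks are hypothetical stubs, and real boot firmware would implement this in the boot ROM environment rather than Python.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

def verify_stage(public_key, image: bytes, signature: bytes) -> bool:
    """ECDSA P-384 + SHA-384 verification of one boot stage."""
    try:
        public_key.verify(signature, image, ec.ECDSA(hashes.SHA384()))
        return True
    except InvalidSignature:
        return False

def boot(root_of_trust_pem: bytes, stages):
    """Verify boot loader, boot configuration, and OS image, in that order."""
    public_key = serialization.load_pem_public_key(root_of_trust_pem)  # P-384 key
    for name, image, signature in stages:
        if not verify_stage(public_key, image, signature):
            enter_recovery(name)  # do not execute or trust the failed stage
            return
        execute(name, image)      # hand off to the verified stage

def enter_recovery(stage: str) -> None: ...  # hypothetical recovery routine
def execute(stage: str, image: bytes) -> None: ...  # hypothetical stage hand-off
```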
|
The [spacecraft] shall perform attestation at each stage of startup and ensure overall trusted boot regime (i.e., root of trust).{SV-IT-3}{SA-8(10),SA-8(11),SA-8(12),SI-7(9),SI-7(10),SI-7(17)}
|
It is important for the computing module to be able to access a set of functions and commands that it trusts; that is, that it knows to be true. This concept is referred to as root of trust (RoT) and should be included in the spacecraft design. With RoT, a device can always be trusted to operate as expected. RoT functions, such as verifying the device’s own code and configuration, must be implemented in secure hardware (e.g., field-programmable gate arrays). By checking the security of each stage of power-up, RoT devices form the first link in a chain of trust that protects the spacecraft.
|
The [spacecraft] hardware root of trust must be an ECDSA NIST P-384 public key.{SV-IT-3}{SI-7(9)}
|
No requirement is imposed on uniqueness.
|
The [spacecraft] hardware root of trust must be loadable only once, post-purchase.{SV-IT-3}{SI-7(9)}
|
No requirement is imposed on preventing hardware readout. The public key belongs to the customer, not the manufacturer, so it must be loaded after purchase. Also, if it can be overwritten, there’s no reason to trust it.
|
The [spacecraft] shall implement trusted boot/RoT as a separate compute engine controlling the trusted computing platform cryptographic processor.{SV-IT-3}{SI-7(9)}
|
|
The [spacecraft] shall implement trusted boot/RoT computing module on radiation tolerant burn-in (non-programmable) equipment.{SV-IT-3}{SI-7(9)}
|
|
The [spacecraft] boot firmware must enter a recovery routine upon failing to verify signed data in the trust chain, and not execute or trust that signed data.{SV-IT-3}{SI-7(9),SI-7(10)}
|
No other requirements are imposed on the recovery routine besides not using the failed data. Unverifiable data isn’t trusted and shouldn’t be run.
|
The [spacecraft] secure boot mechanism shall be Commercial National Security Algorithm Suite (CNSA) compliant.{SV-IT-3}{SI-7(9),SI-7(10)}
|
No certification process is required (or exists). The CNSA is easy to meet, only restricts algorithm choice, and aids ease-of-use for government customers.
|
The [spacecraft] shall allocate enough boot ROM memory for secure boot firmware execution.{SV-IT-3}{SI-7(9),SI-7(10)}
|
|
The [spacecraft] shall allocate enough SRAM memory for secure boot firmware execution.{SV-IT-3}{SI-7(9),SI-7(10)}
|
|
The [spacecraft] shall support the algorithmic construct of Elliptic Curve Digital Signature Algorithm (ECDSA) NIST P-384 + SHA-384 or equivalent strength.{SV-IT-3}{SI-7(9),SI-7(10)}
|
Timing data may suggest cryptographic accelerators are unnecessary. This construct was chosen because (a) it’s in the CNSA suite and (b) it doesn’t require secret values to be stored.
|