The [organization] shall incorporate a verifiable flaw remediation process into the developmental and operational configuration management process.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-5,SA-3,SA-3(1),SA-11,SI-3,SI-3(10)}
|
The verifiable process should also include a cross-reference to mission objectives and impact statements. Understanding the flaws discovered and how they correlate to mission objectives will aid in prioritization.
|
The [organization] shall verify that the scope of security testing/evaluation provides complete coverage of required security controls (to include abuse cases and penetration testing) at the depth of testing defined in the test documents.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,RA-5(3),SA-11(5),SA-11(7)}
|
* The frequency of testing should be driven by Program completion events and updates.
* Examples of approaches are static analysis, dynamic analysis, binary analysis, or a hybrid of the three.
|
The [organization] shall maintain evidence of the execution of the security assessment plan and the results of the security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11}
|
|
The [organization] shall create and implement a security assessment plan that includes: (1) The types of analyses, testing, evaluation, and reviews of all software and firmware components; (2) The degree of rigor to be applied to include abuse cases and/or penetration testing; and (3) The types of artifacts produced during those processes.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-2,CA-8,SA-11,SA-11(5)}
|
The security assessment plan should include evaluation of mission objectives in relation to the security of the mission. Assessments should not only be control-based but also function-based, to ensure the mission is resilient against failures of controls.
|
The [organization] shall determine the vulnerabilities/weaknesses that require remediation, and coordinate the timeline for that remediation, in accordance with the analysis of the vulnerability scan report, the mission assessment of risk, and mission needs.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-5,CM-3,RA-5,RA-7,SI-3,SI-3(10)}
|
|
The [organization] shall employ dynamic analysis (e.g., using simulation, penetration testing, fuzzing, etc.) to identify software/firmware weaknesses and vulnerabilities in developed and incorporated code (open source, commercial, or third-party developed code).{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CA-8,CM-10(1),RA-3(1),SA-11(5),SA-11(8),SA-11(9),SI-3,SI-7(10)}
|
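As an illustration of the fuzzing portion of dynamic analysis, the sketch below mutates a known-good command packet and records any input that makes a parser fail in an unexpected way. The `parse_packet` function, the packet layout, and the iteration count are hypothetical placeholders for the developed or incorporated code actually under test.

```python
import random

def parse_packet(data: bytes) -> dict:
    """Hypothetical stand-in for the developed/incorporated code under test."""
    if len(data) < 4:
        raise ValueError("packet too short")
    return {"apid": int.from_bytes(data[0:2], "big"),
            "length": int.from_bytes(data[2:4], "big"),
            "payload": data[4:]}

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Flip a few random bytes in a known-good seed packet."""
    data = bytearray(seed)
    for _ in range(rng.randint(1, 4)):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def fuzz(seed: bytes, iterations: int = 10_000) -> list:
    """Record inputs that raise anything other than the expected ValueError."""
    rng = random.Random(0)        # fixed seed for reproducible runs
    findings = []
    for _ in range(iterations):
        case = mutate(seed, rng)
        try:
            parse_packet(case)
        except ValueError:
            pass                   # rejected cleanly; not a finding
        except Exception as exc:   # unexpected failure: candidate weakness
            findings.append((case, repr(exc)))
    return findings

if __name__ == "__main__":
    seed_packet = bytes.fromhex("01a2000401020304")   # illustrative seed input
    print(f"{len(fuzz(seed_packet))} unexpected failures")
```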
|
The [organization] shall test software and firmware updates related to flaw remediation for effectiveness and potential side effects on mission systems in a separate test environment before installation.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-3,CM-3(1),CM-3(2),CM-4(1),CM-4(2),CM-10(1),SA-8(31),SA-11(9),SI-2,SI-3,SI-3(10),SI-7(10),SI-7(12),SR-5(2)}
|
This requirement is focused on software and firmware flaws. If hardware flaw remediation is required, refine the requirement to make this clear.
|
The [organization] shall release updated versions of the mission information systems incorporating security-relevant software and firmware updates, after suitable regression testing, at a frequency no greater than [Program-defined frequency [90 days]].{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-3(2),CM-4(1)}
|
On-orbit patching/upgrades may be necessary if vulnerabilities are discovered after launch. The system should have the ability to update software post-launch.
|
The [organization] shall prohibit the use of binary or machine-executable code from sources with limited or no warranty and without the provision of source code.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{CM-7(8)}
|
|
The [organization] shall report identified systems or system components containing software affected by recently announced cybersecurity-related software flaws (and potential vulnerabilities resulting from those flaws) to [organization] officials with cybersecurity responsibilities.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-11}{IR-6,IR-6(2),SI-2,SI-3,SI-4(12),SR-4(4)}
|
|
The [organization] shall use the threat and vulnerability analyses of the as-built system, system components, or system services to inform and direct subsequent testing/evaluation of the as-built system, component, or service.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-3(3),SA-11(2),SA-15(8),SI-3}
|
|
The [organization] shall ensure that the vulnerability scanning tools (e.g., static analysis and/or component analysis tools) used include the capability to readily update the list of potential information system vulnerabilities to be scanned.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(1),RA-5(3),SI-3}
|
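One way to provide the "readily update" capability is to keep the vulnerability definitions in an external, refreshable data file rather than hard-coded in the scanner. The sketch below assumes an illustrative JSON layout; the file name, field names, and inventory format are not taken from any specific tool.

```python
import json
from pathlib import Path

def load_definitions(path: Path) -> list[dict]:
    """Read the current vulnerability definitions; refreshing this file updates
    what the scan looks for without changing the scanner itself."""
    return json.loads(path.read_text())

def scan_inventory(inventory: list[dict], definitions: list[dict]) -> list[dict]:
    """Flag inventory items whose name/version matches a definition."""
    matches = []
    for item in inventory:
        for d in definitions:
            if item["name"] == d["component"] and item["version"] in d["affected_versions"]:
                matches.append({"item": item, "vuln_id": d["id"]})
    return matches

if __name__ == "__main__":
    defs = load_definitions(Path("vuln_definitions.json"))   # illustrative path
    inventory = [{"name": "libfoo", "version": "1.2.3"}]      # illustrative component
    for m in scan_inventory(inventory, defs):
        print(m["vuln_id"], m["item"]["name"])
```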
|
The [organization] shall perform vulnerability analysis and risk assessment of all systems and software.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(3),SA-15(7),SI-3}
|
|
The [organization] shall ensure that vulnerability scanning tools and techniques are employed that facilitate interoperability among tools and automate parts of the vulnerability management process by using standards for: (1) Enumerating platforms, custom software flaws, and improper configurations; (2) Formatting checklists and test procedures; and (3) Measuring vulnerability impact.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,RA-5(3),SI-3}
|
Component/origin scanning identifies open-source libraries/software that may be included in the baseline and checks them for known vulnerabilities and open-source license violations.
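A minimal sketch of component/origin scanning is shown below. It assumes a CycloneDX-style SBOM layout and uses an illustrative license allow-list and known-vulnerability map; a real implementation would consult the program's approved license policy and a maintained vulnerability feed.

```python
import json
from pathlib import Path

ALLOWED_LICENSES = {"MIT", "BSD-3-Clause", "Apache-2.0"}      # illustrative policy
KNOWN_VULNS = {("libbar", "2.0.1"): ["CVE-YYYY-NNNN"]}        # illustrative feed entry

def check_sbom(sbom_path: Path) -> list[str]:
    """Report license violations and known-vulnerable components in the SBOM."""
    findings = []
    sbom = json.loads(sbom_path.read_text())
    for comp in sbom.get("components", []):           # CycloneDX-style layout assumed
        name, version = comp["name"], comp["version"]
        licenses = {lic["license"]["id"] for lic in comp.get("licenses", [])}
        if licenses and not licenses <= ALLOWED_LICENSES:
            findings.append(f"{name} {version}: license(s) {licenses - ALLOWED_LICENSES} not approved")
        for vuln in KNOWN_VULNS.get((name, version), []):
            findings.append(f"{name} {version}: known vulnerability {vuln}")
    return findings

if __name__ == "__main__":
    for finding in check_sbom(Path("sbom.json")):      # illustrative path
        print(finding)
```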
|
The [organization] shall perform static source code analysis for all available source code looking for [[organization]-defined Top CWE List] weaknesses using a complementary set of static code analysis tools (i.e., more than one).{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,SA-11(1),SA-15(7)}
|
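Because no single analyzer detects every weakness class, results from the complementary tools are typically merged and de-duplicated before triage. The sketch below assumes each tool exports a simple JSON list of findings with file, line, and CWE fields; the report format, file names, and Top CWE list shown are illustrative.

```python
import json
from pathlib import Path

TOP_CWES = {"CWE-120", "CWE-416", "CWE-787"}   # illustrative [organization]-defined list

def load_findings(report_path: Path, tool: str) -> list[dict]:
    """Each report is assumed to be a JSON list of {file, line, cwe, message}."""
    findings = json.loads(report_path.read_text())
    for f in findings:
        f["tool"] = tool
    return findings

def merge(reports: list[list[dict]]) -> dict:
    """De-duplicate findings across tools on (file, line, cwe), keeping only
    weaknesses in the prioritized Top CWE list."""
    merged = {}
    for report in reports:
        for f in report:
            if f["cwe"] not in TOP_CWES:
                continue
            key = (f["file"], f["line"], f["cwe"])
            merged.setdefault(key, []).append(f["tool"])
    return merged

if __name__ == "__main__":
    reports = [load_findings(Path("tool_a.json"), "tool_a"),   # illustrative reports
               load_findings(Path("tool_b.json"), "tool_b")]
    for (path, line, cwe), tools in merge(reports).items():
        print(f"{path}:{line} {cwe} reported by {', '.join(tools)}")
```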
|
The [organization] shall analyze vulnerability/weakness scan reports and results from security control assessments.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5,SI-3}
|
|
The [organization] shall ensure that the list of potential system vulnerabilities scanned is updated [prior to a new scan].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{RA-5(2),SI-3}
|
|
The [organization] shall perform configuration management of the system, component, or service during [design; development; implementation; operations].{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-10}
|
|
The [organization] shall review proposed changes to the spacecraft, assessing both mission and security impacts.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-10,CM-3(2)}
|
|
The [organization] shall correct flaws identified during security testing/evaluation.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11}
|
Flaws that impact the mission objectives should be prioritized.
|
The [organization] shall perform [Selection (one or more): unit; integration; system; regression] testing/evaluation at [Program-defined depth and coverage].{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11}
|
The depth needs to include functional testing as well as negative/abuse testing.
|
The [organization] shall create a prioritized list of software weakness classes (e.g., Common Weakness Enumerations) to be used during static code analysis for prioritization of static analysis results.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(1),SA-15(7)}
|
The prioritized list of CWEs should be created considering the operational environment, attack surface, etc. Results from threat modeling and attack surface analysis should be used as inputs into the CWE prioritization process. The CWSS process (https://cwe.mitre.org/cwss/cwss_v1.0.1.html) can also be used to prioritize CWEs. The prioritized list can additionally inform tool selection, since tools can be chosen based on their ability to detect the high-priority CWEs.
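A simplified, illustrative scoring scheme is sketched below; it is not the CWSS formula, and the factors and weights are hypothetical examples of inputs a team might take from threat modeling and attack surface analysis.

```python
# Simplified, illustrative prioritization score -- not the CWSS formula.
# Each factor is rated 0.0-1.0 by the assessment team; weights are hypothetical.
WEIGHTS = {"exploit_likelihood": 0.4, "mission_impact": 0.4, "attack_surface_exposure": 0.2}

def priority_score(ratings: dict) -> float:
    """Weighted sum of the team's ratings for one CWE."""
    return sum(WEIGHTS[factor] * ratings[factor] for factor in WEIGHTS)

# Illustrative ratings informed by threat modeling and attack surface analysis.
cwe_ratings = {
    "CWE-787": {"exploit_likelihood": 0.9, "mission_impact": 0.9, "attack_surface_exposure": 0.7},
    "CWE-476": {"exploit_likelihood": 0.6, "mission_impact": 0.8, "attack_surface_exposure": 0.4},
}

prioritized = sorted(cwe_ratings, key=lambda c: priority_score(cwe_ratings[c]), reverse=True)
print(prioritized)   # highest-priority CWEs first; feeds tool selection and triage
```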
|
The [organization] shall use threat modeling and vulnerability analysis to inform the current development process using analysis from similar systems, components, or services where applicable.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(2),SA-15(8)}
|
|
The [organization] shall perform and document threat and vulnerability analyses of the as-built system, system components, or system services.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(2),SI-3}
|
|
The [organization] shall perform a manual code review of all flight code.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(4)}
|
|
The [organization] shall conduct an Attack Surface Analysis and reduce attack surfaces to a level that presents a low risk of compromise by an attacker.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-11(6),SA-15(5)}
|
|
The [organization] shall define acceptable coding languages to be used by the software developer.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15}
|
|
The [organization] shall define acceptable secure coding standards for use by the software developers.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15}
|
|
The [organization] shall have automated means to evaluate adherence to coding standards.{SV-SP-1,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15,SA-15(7),RA-5}
|
Manual review cannot scale across the code base; an automated means of confirming that coding standards are being met is required. The intent is for automated checks to ensure code adheres to the defined coding standard.
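As an illustration only, the sketch below scans C source files for library calls banned by a hypothetical coding-standard rule and returns a non-zero exit code so a build pipeline can fail on violations. In practice this role is usually filled by a configured static analysis or linting tool; the rule list here is not from any published standard.

```python
import re
import sys
from pathlib import Path

# Illustrative rules from a hypothetical coding standard: banned C library calls.
BANNED = {"strcpy": "use a bounded copy", "sprintf": "use snprintf", "gets": "never use"}

def check_file(path: Path) -> list[str]:
    """Return one violation message per banned call found in the file."""
    violations = []
    for lineno, line in enumerate(path.read_text(errors="replace").splitlines(), 1):
        for func, advice in BANNED.items():
            if re.search(rf"\b{func}\s*\(", line):
                violations.append(f"{path}:{lineno}: {func} is banned ({advice})")
    return violations

if __name__ == "__main__":
    source_root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    all_violations = [v for f in source_root.rglob("*.c") for v in check_file(f)]
    print("\n".join(all_violations))
    sys.exit(1 if all_violations else 0)   # non-zero exit fails the build
```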
|
The [organization] shall perform component analysis (a.k.a. origin analysis) for developed or acquired software.{SV-SP-1,SV-SP-2,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SA-15(7),RA-5}
|
|
The [organization] shall correct reported cybersecurity-related information system flaws.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SI-2}
|
* Although this requirement is stated to specifically apply to cybersecurity-related flaws, the Program office may choose to broaden it to all SV flaws.
* This requirement is allocated to the Program, as it is presumed the Program has the greatest knowledge of the components of the system and of when identified flaws apply.
|
The [organization] shall identify, report, and coordinate correction of cybersecurity-related information system flaws.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-7,SV-SP-9,SV-SP-11}{SI-2}
|
|
The [organization] shall ensure reused TT&C software has adequate uniqueness for command decoders/dictionaries so that commands are received by only the intended satellite.{SV-SP-6}{AC-17(10),SC-16(3),SI-3(9)}
|
The goal is to eliminate the risk that compromise of one command database affects a different one due to reuse. The intent is to ensure that one SV cannot process commands intended for another SV. Given the crypto setup, with keys and VCC needing to match, this requirement may be inherently met as a result of using Type 1 cryptography. The intent is not to recreate entire command dictionaries but to have enough uniqueness in place to prevent an SV from accepting a rogue command. As long as there is some uniqueness at the receiving end of the commands, that is adequate.
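The sketch below is an illustrative ground analogue of the kind of uniqueness intended: reused decoder software carries a per-vehicle identifier (alongside the keys/VCC) so a command built for one SV is rejected by every other SV. The frame layout, spacecraft ID, and opcode values are hypothetical and do not represent any real TT&C format.

```python
# Illustrative only -- not a real TT&C frame layout. The point is that reused
# decoder software carries per-vehicle uniqueness (here, a spacecraft ID check
# and a per-vehicle command dictionary) so a command built for one SV is
# rejected by every other SV.
THIS_SPACECRAFT_ID = 0x2A                  # hypothetical per-vehicle constant
VALID_OPCODES = {0x01, 0x02, 0x10}         # hypothetical per-vehicle command dictionary subset

def accept_command(frame: bytes) -> bool:
    """Reject any frame whose spacecraft ID does not match this vehicle."""
    if len(frame) < 3:
        return False
    spacecraft_id, opcode = frame[0], frame[1]
    if spacecraft_id != THIS_SPACECRAFT_ID:
        return False                        # addressed to a different vehicle
    return opcode in VALID_OPCODES

print(accept_command(bytes([0x2A, 0x01, 0x00])))  # True: correct SV, known opcode
print(accept_command(bytes([0x2B, 0x01, 0x00])))  # False: addressed to another SV
```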
|
The [spacecraft] shall prevent the installation of Flight Software without verification that the component has been digitally signed using a certificate that is recognized and approved by the ground.{SV-SP-1,SV-SP-3,SV-SP-6,SV-SP-9}{CM-3,CM-3(8),CM-5,CM-5(3),CM-14,SA-8(8),SA-8(31),SA-10(2),SI-3,SI-7(12),SI-7(15)}
|
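On the ground-tooling side, the check can be sketched as verifying the image's signature against an approved public key before the image is released for upload and installation. The example below uses the Python `cryptography` package; the file names and the RSA PKCS#1 v1.5 with SHA-256 scheme are assumptions and should match the program's actual PKI.

```python
from pathlib import Path

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def verify_image(image_path: Path, sig_path: Path, pubkey_path: Path) -> bool:
    """Return True only if the image's signature verifies against the approved key."""
    public_key = serialization.load_pem_public_key(pubkey_path.read_bytes())
    try:
        public_key.verify(
            sig_path.read_bytes(),
            image_path.read_bytes(),
            padding.PKCS1v15(),        # assumed scheme; match the program's PKI
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    ok = verify_image(Path("fsw_image.bin"), Path("fsw_image.sig"), Path("approved_key.pem"))
    print("install permitted" if ok else "REJECT: signature not from an approved certificate")
```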
|
The [organization] shall ensure that software planned for reuse meets fit, form, function, and security requirements as a component within the new application.{SV-SP-6,SV-SP-7,SV-SP-11}{CM-7(5)}
|
|