Identifies unauthorized activation of hardware debugging features, which could facilitate backdoor access. The pattern flags cases where hardware debugging mode is activated at an unexpected time (activation_time != 'expected_time'). It is useful where debugging activity should occur only within predefined operational windows, so activation outside those windows may signal malicious activity or tampering.
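The check above can be sketched as a simple window comparison. This is a minimal illustration, not a real detection rule: the `EXPECTED_WINDOWS` structure and the UTC maintenance window are assumptions, and only the `activation_time`/`expected_time` comparison comes from the pattern description.

```python
from datetime import datetime, time

# Hypothetical operational windows in which hardware-debug activation
# is expected (names and the 02:00-04:00 UTC window are illustrative).
EXPECTED_WINDOWS = [(time(2, 0), time(4, 0))]

def debug_activation_suspicious(activation_time: datetime) -> bool:
    """Return True when a hardware-debug activation falls outside every
    predefined operational window (i.e. activation_time != expected_time)."""
    t = activation_time.time()
    return not any(start <= t <= end for start, end in EXPECTED_WINDOWS)

# An activation at 13:37 UTC lies outside the window and is flagged.
print(debug_activation_suspicious(datetime(2024, 5, 1, 13, 37)))  # True
```

In practice the comparison would run against telemetered debug-port status rather than wall-clock input, but the core logic is the same inequality the pattern describes.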
| ID | Name | Description |
|----|------|-------------|
| IA-0001.03 | Hardware Supply Chain | Adversaries alter boards, modules, or programmable logic prior to delivery to create latent access or reliability sabotage. Tactics include inserting hardware Trojans in ASIC/FPGA designs, modifying bitstreams or disabling security fuses, leaving debug interfaces (JTAG/SWD/UART) active, substituting near-spec counterfeits, or embedding parts that fail after specific environmental or temporal conditions (“time-bomb” components). Other avenues target programming stations and “golden” images so entire lots inherit the same weakness. Microcontroller boot configurations, peripheral EEPROMs, and supervisory controllers are common leverage points because small changes there can reshape trust boundaries across the bus. The effect is a platform that behaves nominally through acceptance test yet enables covert control, targeted degradation, or delayed failure once on orbit. |
| IA-0012 | Assembly, Test, and Launch Operation Compromise | Assembly, Test, and Launch Operation (ATLO) concentrates people, tools, and authority while components first exchange real traffic across flight interfaces. Test controllers, EGSE, simulators, flatsats, loaders, and data recorders connect to the same buses and command paths that will exist on orbit. Threat actors exploit this density and dynamism: compromised laptops or transient cyber assets push images and tables; lab networks bridge otherwise separate enclaves; vendor support accounts move software between staging and flight hardware; and “golden” artifacts created or modified in ATLO propagate into the as-flown baseline. Malware can traverse shared storage and scripting environments, ride update/checklist execution, or piggyback on protocol translators and gateways used to stimulate subsystems. Because ATLO often introduces late firmware loads, key/counter initialization, configuration freezes, and full-system rehearsals, a single well-placed change can yield first execution on multiple devices and persist into LEOP. |
| EX-0005 | Exploit Hardware/Firmware Corruption | The adversary achieves execution or effect by corrupting or steering behavior beneath the software stack, in device firmware, programmable logic, or the hardware itself. Examples include tampering with firmware images or configuration blobs burned into non-volatile memory; targeting MCU/SoC boot ROM fallbacks; editing FPGA bitstreams or partial-reconfiguration frames; or leveraging physical phenomena and timing to flip bits or skip checks. Because these actions occur below or alongside the operating system and application FSW, traditional endpoint safeguards see normal interfaces while trust anchors are already altered. |
| EX-0005.01 | Design Flaws | Threat actors may exploit inherent properties or errata in the hardware/logic design rather than injecting new code. Levers include undocumented or weakly specified behaviors (scan chains, test modes, debug straps), counter/timer rollovers and wraparound, interrupt storms and priority inversions, MMU/TLB corner cases, DMA engines that can write outside intended buffers, and bus arbitration or clock-domain crossing issues that permit stale or reordered writes. RNGs and crypto accelerators with flawed seeding or side-channel leakage can expose secrets or enable predictable authentication values. In programmable logic, vulnerable state machines, insufficient reset paths, and hazardous partial-reconfiguration regions create opportunities to drive the design into privileged or undefined states. Even reliability features can be turned: hardware timers intended for liveness can be paced to starve control loops; ECC policies can be nudged so correction conceals attacker-induced drift. The common thread is using the platform’s own guarantees (timing, priority, persistence, fault handling) to cause privileged behavior that the software stack accepts as “by design.” |
| PER-0002.01 | Hardware Backdoor | Hardware backdoors leverage properties of the physical design to provide durable, low-visibility reentry. Examples include enabled test/scan chains, manufacturing or boot-strap modes invoked by pins or registers, persistent debug interfaces (JTAG/SWD/UART), undocumented device commands, and logic inserted in FPGA/ASIC designs that activates under specific stimuli. Because these mechanisms sit below or beside flight software, they can grant direct access to buses, memories, or peripheral control even when higher layers appear healthy. Triggers may be electrical (pin states, voltage/clock sequences), protocol-level (special patterns on an instrument link), or environmental/temporal (particular temperature ranges, timing offsets). Once on orbit, such pathways are difficult to remove or reconfigure, allowing the attacker to persist by reusing the same physical entry points whenever conditions are met. |
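Several of the techniques above (supply-chain tampering, left-enabled debug interfaces, disabled security fuses) share a common acceptance-test countermeasure: comparing the observed hardware configuration against a trusted golden baseline. The sketch below illustrates only that comparison; every register name and expected value is hypothetical and does not correspond to any real device.

```python
# Golden boot/debug configuration expected at acceptance test.
# All field names and values are illustrative assumptions.
GOLDEN_CONFIG = {
    "jtag_enabled": False,        # debug port should be fused off
    "swd_enabled": False,
    "boot_rom_fallback": False,   # no fallback to unauthenticated boot
    "security_fuse_blown": True,
}

def config_deviations(observed: dict) -> list[str]:
    """Compare an observed hardware configuration against the golden
    baseline; any mismatch is a candidate latent-access indicator."""
    return [
        key
        for key, expected in GOLDEN_CONFIG.items()
        if observed.get(key) != expected
    ]

# A unit delivered with its JTAG interface still active deviates
# from the baseline and would be flagged before integration.
observed = {
    "jtag_enabled": True,
    "swd_enabled": False,
    "boot_rom_fallback": False,
    "security_fuse_blown": True,
}
print(config_deviations(observed))  # ['jtag_enabled']
```

A real check would read these states from device fuses, boot-configuration words, or a hardware attestation report rather than a dictionary, but the pass/fail logic against a known-good baseline is the same.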