Changes from all commits (31 commits)
09f0c07
fix(pyats): prevent credential exposure in PyATS archives (#689)
oboehmer Mar 22, 2026
dac1aa1
fix(security): auto-cleanup merged_data_model_test_variables.yaml (#677)
oboehmer Mar 23, 2026
d6107c4
Merge remote-tracking branch 'origin/main' into fix/677-merged-data-c…
oboehmer Mar 24, 2026
a27edf4
Merge remote-tracking branch 'origin/main' into fix/677-merged-data-c…
oboehmer Mar 25, 2026
69191ba
refactor: pass merged_data dict through Robot path, simplify DataMerger
oboehmer Mar 25, 2026
82d2333
fix: correct signal handler restoration for SIG_DFL (falsy value 0)
oboehmer Mar 28, 2026
2de08ba
refactor: centralise merged data file path via DataMerger
oboehmer Mar 28, 2026
b5984a8
refactor: remove exit_with_cleanup closure from main()
oboehmer Mar 28, 2026
211b88d
feat: preserve debug-relevant files when NAC_TEST_DEBUG is set
oboehmer Mar 28, 2026
64bba1e
test: assert merged data YAML is absent after a successful run
oboehmer Mar 28, 2026
ead4ae1
restore comments in main
oboehmer Mar 28, 2026
a79a3b2
refactor: place merged_data before optional params in orchestrator si…
oboehmer Mar 28, 2026
6007905
fix: log full exception traceback to file while keeping terminal outp…
oboehmer Mar 28, 2026
9d0f71b
Merge remote-tracking branch 'origin/main' into fix/677-704-merged-da…
oboehmer Mar 28, 2026
403db83
refactor: resolve merged_data_path to absolute once at construction
oboehmer Mar 28, 2026
7891146
register DeviceExecutor temp files with CleanupManager
oboehmer Mar 28, 2026
ef0e398
extract _setup_run_tests_mocks helper in test_orchestrator
oboehmer Mar 28, 2026
2d3e993
tests: remove duplicate test_merged_data_path_matches_written_file
oboehmer Mar 28, 2026
2b644c5
tests: add integration test for CleanupManager atexit via typer.Exit(0)
oboehmer Mar 28, 2026
dd5c8ff
refactor: rename skip_if_debug to keep_if_debug in CleanupManager
oboehmer Mar 28, 2026
6a52131
tests: add subprocess-level signal coverage for CleanupManager
oboehmer Mar 28, 2026
9cdf355
remove trivial changes from new test_robot_writer.py
oboehmer Mar 28, 2026
e931ada
fix: guard bad directive syntax, fix stale docstring, use LOG_HTML co…
oboehmer Mar 28, 2026
740fe99
add PRD, adjust comment
oboehmer Mar 28, 2026
a9bfb7f
minor adjustments in cleanup.py
oboehmer Mar 28, 2026
03c7b94
tests: restore integration rendering tests and fix fixture placement
oboehmer Mar 28, 2026
c0a8442
refactor: replace _cleanup/cleanup_now with single public run_cleanup()
oboehmer Mar 28, 2026
7156e72
reference MERGED_DATA_FILENAME in tests
oboehmer Mar 28, 2026
2c19d62
add comment on fork safety
oboehmer Mar 28, 2026
56a3762
tests: improve CleanupManager unit test quality
oboehmer Mar 28, 2026
252fb78
update PRD
oboehmer Mar 28, 2026
12 changes: 1 addition & 11 deletions README.md
Original file line number Diff line number Diff line change
@@ -58,8 +58,6 @@ $ nac-test --help
│ --verbose Enables verbose output for nac-test, │
│ Robot and PyATS execution. │
│ [env var: NAC_TEST_VERBOSE] │
│ --merged-data-file… -m TEXT Filename for merged data model. │
│ [default: merged_data_model_test...] │
│ --loglevel -l [DEBUG|...] Log level. [default: WARNING] │
│ --version Display version number. │
│ --help Show this message and exit. │
@@ -372,17 +370,9 @@ Before test execution, `nac-test` merges all YAML data files into a single data

1. All files from `--data` paths are recursively loaded
2. YAML structures are deep-merged (later files override earlier ones)
3. The merged result is written to the output directory as `merged_data_model_test_variables.yaml`
4. Both Robot and PyATS tests reference this merged data
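
The deep merge in step 2 can be sketched as follows. This is illustrative only — the actual merging is delegated to the `nac-yaml` package, whose handling of lists and conflicts may differ:

```python
from typing import Any

def deep_merge(base: dict[str, Any], override: dict[str, Any]) -> dict[str, Any]:
    """Recursively merge override into base; later values win on conflicts."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(merged.get(key), dict) and isinstance(value, dict):
            merged[key] = deep_merge(merged[key], value)  # recurse into nested mappings
        else:
            merged[key] = value  # scalars and lists are replaced wholesale
    return merged
```

Applied pairwise across all loaded files in order, this gives "later files override earlier ones" while preserving non-conflicting keys at every nesting level.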

### Custom Filename

By default, the merged file is named `merged_data_model_test_variables.yaml`. You can customize this:

```bash
nac-test -d ./data -t ./tests -o ./output -m my_custom_data.yaml
```

### Accessing the Merged Data

The merged data model is available to:
73 changes: 53 additions & 20 deletions dev-docs/PRD_AND_ARCHITECTURE.md
@@ -842,20 +842,6 @@ nac-test provides 13 CLI flags covering data input, template configuration, exec
nac-test -d config/ -t templates/ --tests custom_tests/ -o output/
```

**`-m, --merged-data-filename`** (OPTIONAL)
- **Type**: `str`
- **Default**: `"merged_data_model_test_variables.yaml"`
- **Environment Variable**: None (CLI only)
- **Purpose**: Custom filename for merged data model YAML (SOT shared across frameworks)
- **Behavior**:
- Written to `{output_dir}/{merged_data_filename}`
- Passed to CombinedOrchestrator constructor
- Both PyATS and Robot read from this shared file
- **Example**:
```bash
nac-test -d config/ -t templates/ -o output/ -m merged_data_prod.yaml
```

##### Execution Control Flags

**`-i, --include`** (OPTIONAL)
@@ -1296,7 +1282,7 @@ exit() # Checks error_handler.fired for exit code

#### Environment Variable Support

Every CLI flag (except `--version`) supports environment variable configuration for CI/CD and containerized environments.

**Environment Variable Mapping Table**:

@@ -6072,16 +6058,63 @@ class SystemResourceCalculator:

### Cleanup Utilities (`utils/cleanup.py`)

#### CleanupManager

`CleanupManager` is a process-wide singleton that ensures registered files
are deleted when the process exits, regardless of how it exits.

**Triggers covered:**
- Normal exit via `atexit` (including `typer.Exit`)
- `SIGTERM` (e.g. `docker stop`, `kill`) — Unix only
- `SIGINT` (Ctrl+C / `KeyboardInterrupt`) — Unix only

`SIGKILL` cannot be intercepted; files may remain if the process is killed
with `-9`.

**Key methods:**

```python
cleanup = get_cleanup_manager()  # always returns the singleton

cleanup.register(path)  # delete path on exit
cleanup.register(path, keep_if_debug=True)  # skip deletion when NAC_TEST_DEBUG=true
cleanup.unregister(path)  # cancel a previously registered path
cleanup.run_cleanup()  # trigger cleanup immediately (idempotent)
```

**`keep_if_debug` flag**

Files registered with `keep_if_debug=True` are retained when
`NAC_TEST_DEBUG=true` is set. Use this for intermediate files that are
useful for debugging (job scripts, per-device testbed YAMLs, merged data
model). Sensitive files (e.g. files containing credentials) should always
be registered without this flag so they are unconditionally removed.
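
The registration and cleanup semantics can be sketched as follows. This is a minimal illustrative model with hypothetical internals — the real `CleanupManager` additionally installs the signal handlers described below:

```python
import atexit
import os
import threading
from pathlib import Path

class CleanupManager:
    """Illustrative sketch of registration/cleanup semantics (not the real class)."""

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._entries: dict[Path, bool] = {}  # path -> keep_if_debug flag
        self._ran = False
        atexit.register(self.run_cleanup)  # normal-exit trigger (covers typer.Exit)

    def register(self, path: Path, keep_if_debug: bool = False) -> None:
        with self._lock:
            self._entries[Path(path)] = keep_if_debug

    def unregister(self, path: Path) -> None:
        with self._lock:
            self._entries.pop(Path(path), None)

    def run_cleanup(self) -> None:
        with self._lock:
            if self._ran:  # idempotent: later triggers are no-ops
                return
            self._ran = True
            debug = os.environ.get("NAC_TEST_DEBUG", "").lower() == "true"
            for path, keep in self._entries.items():
                if keep and debug:
                    continue  # retained for debugging
                path.unlink(missing_ok=True)

_singleton = CleanupManager()

def get_cleanup_manager() -> CleanupManager:
    return _singleton
```

Note that a sensitive file registered without `keep_if_debug` is removed even when `NAC_TEST_DEBUG=true`.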

**Thread safety:** all operations are protected by an internal lock.

**Fork safety:** the singleton must not be used in forked child processes.
If a process is forked while `CleanupManager` is initialised, the child
inherits the lock in an undefined state and cleanup behaviour is
unpredictable. Subprocesses spawned via `subprocess.Popen` are unaffected
— they start a fresh interpreter with no inherited singleton state.

**Signal handler behaviour:** on SIGTERM the original handler is restored
and the signal is re-raised via `signal.raise_signal()` so upstream
process managers (Docker, systemd) see the expected exit status. On SIGINT
the original handler is restored and `KeyboardInterrupt` is raised so the
normal Python exception handling chain proceeds.
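
The restore-and-re-raise pattern can be demonstrated in isolation (Unix only; a standalone sketch, not nac-test code):

```python
import signal
import subprocess
import sys

# Child installs a SIGTERM handler that "cleans up", restores the default
# handler, and re-raises — so the parent observes death-by-SIGTERM rather
# than a plain exit code.
child = """
import signal

def handler(signum, frame):
    # ... run cleanup here ...
    signal.signal(signal.SIGTERM, signal.SIG_DFL)  # restore original handler
    signal.raise_signal(signal.SIGTERM)            # re-raise for correct exit status

signal.signal(signal.SIGTERM, handler)
signal.raise_signal(signal.SIGTERM)
"""

proc = subprocess.run([sys.executable, "-c", child])
# On Unix, a negative returncode means "terminated by that signal".
print(proc.returncode == -signal.SIGTERM)
```

One subtlety: `signal.SIG_DFL` has the value 0, so code restoring a saved handler must compare against `None` rather than relying on truthiness — the bug fixed in commit 82d2333.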

#### PyATS runtime cleanup helpers

Three standalone functions handle CI/CD-specific directory hygiene (not
related to `CleanupManager`):

| Function | Purpose |
|---|---|
| `cleanup_pyats_runtime(workspace_path)` | Removes the `.pyats/` runtime directory before a run to prevent disk exhaustion |
| `cleanup_old_test_outputs(output_dir, days)` | Removes `.jsonl` result files older than `days` days |
| `cleanup_stale_test_artifacts(output_dir)` | Removes `api/`, `d2d/`, and `default/` subdirectories containing stale JSONL files from interrupted runs |
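
The age-based helper might look like this — a sketch under assumed semantics (recursion depth and file matching may differ in the real implementation):

```python
import time
from pathlib import Path

def cleanup_old_test_outputs(output_dir: Path, days: int) -> None:
    """Remove .jsonl result files older than `days` days (illustrative sketch)."""
    cutoff = time.time() - days * 86400  # 86400 seconds per day
    for jsonl in Path(output_dir).rglob("*.jsonl"):
        if jsonl.stat().st_mtime < cutoff:
            jsonl.unlink(missing_ok=True)  # tolerate concurrent deletion
```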

### Environment Utilities (`utils/environment.py`)

Environment variable handling:
30 changes: 12 additions & 18 deletions nac_test/cli/main.py
@@ -26,6 +26,7 @@
EXIT_INVALID_ARGS,
)
from nac_test.data_merger import DataMerger
from nac_test.utils.cleanup import get_cleanup_manager
from nac_test.utils.formatting import format_duration
from nac_test.utils.logging import (
DEFAULT_LOGLEVEL,
@@ -173,16 +174,6 @@ def version_callback(value: bool) -> None:
]


MergedDataFilename = Annotated[
str,
typer.Option(
"-m",
"--merged-data-filename",
help="Filename for the merged data model YAML file.",
),
]


RenderOnly = Annotated[
bool,
typer.Option(
@@ -323,7 +314,6 @@ def main(
version: Version = False, # noqa: ARG001
diagnostic: Diagnostic = False, # noqa: ARG001
verbose: Verbose = False,
merged_data_filename: MergedDataFilename = "merged_data_model_test_variables.yaml",
) -> None:
"""A CLI tool to render and execute Robot Framework and PyATS tests using Jinja templating.

@@ -394,7 +384,11 @@ def main(
typer.echo("\n\n📄 Merging data model files...")

merged_data = DataMerger.merge_data_files(data)
DataMerger.write_merged_data_model(merged_data, output, merged_data_filename)
merged_data_path = DataMerger.write_merged_data_model(merged_data, output)

# Register merged data file for cleanup on exit/signal (SIGTERM/SIGINT)
cleanup_manager = get_cleanup_manager()
cleanup_manager.register(merged_data_path, keep_if_debug=True)

duration = (datetime.now() - start_time).total_seconds()
typer.echo(f"✅ Data model merging completed ({format_duration(duration)})")
@@ -403,9 +397,9 @@ def main(
orchestrator = CombinedOrchestrator(
data_paths=data,
templates_dir=templates,
custom_testbed_path=testbed,
output_dir=output,
merged_data_filename=merged_data_filename,
merged_data=merged_data,
custom_testbed_path=testbed,
filters_path=filters,
tests_path=tests,
include_tags=include,
@@ -439,11 +433,11 @@ def main(
raise typer.Exit(EXIT_INTERRUPTED) from None
except Exception as e:
# Infrastructure errors (template rendering, controller detection, etc.)
logger.exception("Unexpected error during execution")
typer.echo(f"Error during execution: {e}")
# Progressive disclosure: clean output for customers, full context for developers
if DEBUG_MODE:
raise typer.Exit(EXIT_ERROR) from e # Developer: full exception context
raise typer.Exit(EXIT_ERROR) from None # Customer: clean output
# Pretty exception display is controlled by pretty_exceptions_enable=DEBUG_MODE
# on the Typer app — no need for a separate DEBUG_MODE branch here.
raise typer.Exit(EXIT_ERROR) from e

# Display total runtime before exit
runtime_end = datetime.now()
15 changes: 8 additions & 7 deletions nac_test/combined_orchestrator.py
@@ -6,6 +6,7 @@
import logging
import os
from pathlib import Path
from typing import Any

import typer

@@ -74,7 +75,7 @@ def __init__(
data_paths: list[Path],
templates_dir: Path,
output_dir: Path,
merged_data_filename: str,
merged_data: dict[str, Any] | None = None,
filters_path: Path | None = None,
tests_path: Path | None = None,
include_tags: list[str] | None = None,
@@ -97,7 +98,7 @@ def __init__(
data_paths: List of paths to data model YAML files
templates_dir: Directory containing test templates and PyATS test files
output_dir: Base directory for test output
merged_data_filename: Name of the merged data model file
merged_data: Already-loaded merged data model dict (avoids re-reading from disk)
filters_path: Path to Jinja filters (Robot only)
tests_path: Path to Jinja tests (Robot only)
include_tags: Tags to include (Robot only)
@@ -117,7 +118,9 @@ def __init__(
self.data_paths = data_paths
self.templates_dir = Path(templates_dir)
self.output_dir = Path(output_dir)
self.merged_data_filename = merged_data_filename
self.merged_data: dict[str, Any] = (
merged_data if merged_data is not None else {}
)

# Robot-specific parameters
self.filters_path = filters_path
@@ -213,7 +216,6 @@ def run_tests(self) -> CombinedResults:
data_paths=self.data_paths,
test_dir=self.templates_dir,
output_dir=self.output_dir,
merged_data_filename=self.merged_data_filename,
minimal_reports=self.minimal_reports,
custom_testbed_path=self.custom_testbed_path,
controller_type=self.controller_type,
@@ -233,10 +235,9 @@ def run_tests(self) -> CombinedResults:
typer.echo(f"\n🤖 Running Robot Framework tests{mode_suffix}...\n")

robot_orchestrator = RobotOrchestrator(
data_paths=self.data_paths,
templates_dir=self.templates_dir,
output_dir=self.output_dir,
merged_data_filename=self.merged_data_filename,
merged_data=self.merged_data,
filters_path=self.filters_path,
tests_path=self.tests_path,
include_tags=self.include_tags,
@@ -405,7 +406,7 @@ def _print_execution_summary(
if combined_dashboard.exists():
typer.echo(f"Dashboard: {combined_dashboard.resolve()}")
if results.robot is not None and not results.robot.is_empty:
robot_log = self.output_dir / ROBOT_RESULTS_DIRNAME / "log.html"
robot_log = self.output_dir / ROBOT_RESULTS_DIRNAME / LOG_HTML
if robot_log.exists():
typer.echo(f"Robot: {robot_log.resolve()}")
if results.api is not None and not results.api.is_empty:
3 changes: 3 additions & 0 deletions nac_test/core/constants.py
@@ -84,6 +84,9 @@
REPORT_HTML: str = "report.html"
XUNIT_XML: str = "xunit.xml"
ORDERING_FILENAME: str = "ordering.txt"
MERGED_DATA_FILENAME: str = "merged_data_model_test_variables.yaml"
# Owner read/write only — prevents other users from reading sensitive merged data
MERGED_DATA_FILE_MODE: int = 0o600
SUMMARY_SEPARATOR_WIDTH: int = 70

# HTTP status code range boundaries
51 changes: 33 additions & 18 deletions nac_test/data_merger.py
@@ -4,11 +4,18 @@
"""Shared data merging utilities for both Robot and PyATS test execution."""

import logging
import os
from pathlib import Path
from typing import Any

from nac_yaml import yaml

from nac_test.core.constants import (
IS_WINDOWS,
MERGED_DATA_FILE_MODE,
MERGED_DATA_FILENAME,
)

logger = logging.getLogger(__name__)


@@ -32,33 +39,41 @@ def merge_data_files(data_paths: list[Path]) -> dict[str, Any]:
# Ensure we always return a dict, even if yaml returns None
return data if isinstance(data, dict) else {}

@staticmethod
def merged_data_path(output_directory: Path) -> Path:
"""Return the path where the merged data model file will be written.

Single source of truth for locating the merged data file — use this
instead of constructing the path manually from the constant.

Args:
output_directory: Base output directory passed to write_merged_data_model()

Returns:
Full path to the merged data model YAML file
"""
return output_directory / MERGED_DATA_FILENAME

@staticmethod
def write_merged_data_model(
data: dict[str, Any],
output_directory: Path,
filename: str = "merged_data_model_test_variables.yaml",
) -> None:
) -> Path:
"""Write merged data model to YAML file.

The output filename is always MERGED_DATA_FILENAME — the single fixed
location used by all consumers (Robot, PyATS subprocesses, cleanup).

Args:
data: The merged data dictionary to write
output_directory: Directory where the YAML file will be saved
filename: Name of the output YAML file
"""
full_output_path = output_directory / filename
logger.info("Writing merged data model to %s", full_output_path)
yaml.write_yaml_file(data, full_output_path)

@staticmethod
def load_yaml_file(file_path: Path) -> dict[str, Any]:
"""Load a single YAML file from the provided path.

Args:
file_path: Path to the YAML file to load

Returns:
Loaded dictionary from the YAML file
Path to the written file (use this instead of reconstructing the path)
"""
logger.info("Loading yaml file from %s", file_path)
data = yaml.load_yaml_files([file_path])
return data if data is not None else {}
full_output_path = DataMerger.merged_data_path(output_directory)
logger.info("Writing merged data model to %s", full_output_path)
yaml.write_yaml_file(data, full_output_path)
if not IS_WINDOWS:
os.chmod(full_output_path, MERGED_DATA_FILE_MODE)
return full_output_path