Merged (143 commits)
65314c4
do not exit executing notebooks if one fails
kkappler Aug 3, 2025
3742687
use uv for pytest
kkappler Aug 3, 2025
c13634f
rm incorrect uv prefix
kkappler Aug 4, 2025
531faa1
try better utilizing uv, (test can move away from conda)
kkappler Oct 18, 2025
a5c110f
add some doc
kkappler Oct 18, 2025
b36862e
Update python versions for tests
kkappler Oct 18, 2025
a6e106e
rename yml to yaml
kkappler Oct 18, 2025
ec6da92
Merge branch 'uv' into patches
kkappler Oct 18, 2025
dd739f0
pydantic -- fix imports
kkappler Nov 21, 2025
e27f34b
Update processing_configuration_template.json
kujaku11 Nov 23, 2025
4b740c2
stop auto testing until we address all tests locally.
kujaku11 Nov 23, 2025
714bcf6
Update spectrogram_helpers.py
kujaku11 Nov 28, 2025
1eff691
Update test_issue_139.py
kujaku11 Dec 2, 2025
0a14739
Update config_creator.py
kujaku11 Dec 2, 2025
d0bbde0
updating precommit
kujaku11 Dec 2, 2025
a2c9e6a
Add pytest fixtures and test for TF zrr file roundtrip
kujaku11 Dec 3, 2025
67ee871
Create test_transfer_function_kernel_pytest.py
kujaku11 Dec 3, 2025
056955d
Add synthetic pytest tests and improve test fixtures
kujaku11 Dec 3, 2025
c1473cb
Align num_samples_window in frequency band tests
kujaku11 Dec 3, 2025
543e6bf
Refactor synthetic test MTH5 file creation for test isolation
kujaku11 Dec 3, 2025
0d12513
Migrate synthetic tests from unittest to pytest
kujaku11 Dec 5, 2025
6b51c30
Replace transfer function kernel tests with windowing tests
kujaku11 Dec 5, 2025
a43f6dd
Add pytest suite for xarray_helpers module
kujaku11 Dec 5, 2025
0c2a8da
Remove time series test files
kujaku11 Dec 5, 2025
4d38ac7
Add pytest suite for cross_power transfer function
kujaku11 Dec 5, 2025
3e80920
Add regression tests for helper_functions and remove cross_power tests
kujaku11 Dec 5, 2025
72e11bd
Add regression tests for RegressionEstimator base class
kujaku11 Dec 5, 2025
03627ee
Remove regression test files for transfer function
kujaku11 Dec 5, 2025
e5b55fa
Add comprehensive Parkfield pytest suite and fixtures
kujaku11 Dec 5, 2025
477998f
Update tests.yaml
kujaku11 Dec 5, 2025
cd3f7dc
Refactor feature and channel attribute usage
kujaku11 Dec 5, 2025
5cc6d4b
Update tests.yaml
kujaku11 Dec 5, 2025
403c0e2
Update dataset example for Windows paths and metadata
kujaku11 Dec 5, 2025
4d35d16
skipping notebooks for now
kujaku11 Dec 5, 2025
862ec08
Fix config save and update test signatures
kujaku11 Dec 5, 2025
56ab23a
fix filter additions to use new add_filter method
kkappler Dec 6, 2025
61bb118
force run_id in metadata
kkappler Dec 6, 2025
1098555
update python version info, add some pytest helpers
kkappler Dec 6, 2025
7cf3ae2
Refactor Parkfield tests and fixtures for clarity and robustness
kujaku11 Dec 6, 2025
19eef50
Update test_parkfield_pytest.py
kujaku11 Dec 6, 2025
05a6744
Improve plot handling and test comments, update warnings
kujaku11 Dec 6, 2025
cd6de3c
Improve matplotlib backend handling and test data setup
kujaku11 Dec 8, 2025
d2004fe
Refactor tests to reuse processed transfer functions
kujaku11 Dec 8, 2025
f97161b
Set fixture scope to class for kernel dataset tests
kujaku11 Dec 8, 2025
a733572
Remove obsolete Parkfield test scripts
kujaku11 Dec 8, 2025
e077b6f
Update processing_configuration_template.json
kujaku11 Dec 8, 2025
19d505c
Optimize synthetic test suite with class-scoped fixtures
kujaku11 Dec 8, 2025
c3af257
Fix station metadata by converting timeseries.Station to dict
kujaku11 Dec 9, 2025
ed4d66e
Add ZFile transfer function comparison utilities
kujaku11 Dec 9, 2025
1f4c864
updating how survey metadata is filled
kujaku11 Dec 10, 2025
fd3e9b4
changed default of None to 1
kujaku11 Dec 10, 2025
ffb37fc
Update edf_weights.py
kujaku11 Dec 11, 2025
438ba49
fixing bugs with feature weighting
kujaku11 Dec 12, 2025
f20b578
Update feature_weights.py
kujaku11 Dec 12, 2025
861bf3b
updating logging messages
kujaku11 Dec 12, 2025
f324408
Use persistent cache for Parkfield MTH5 test data
kujaku11 Dec 17, 2025
90d1bd0
Add vectorized pass_band optimization and analysis docs
kujaku11 Dec 17, 2025
784a2d1
Update test_parkfield_pytest.py
kujaku11 Dec 17, 2025
e1727f1
Improve survey metadata handling in TransferFunctionKernel
kujaku11 Dec 17, 2025
201ebfa
removing sandbox test files.
kujaku11 Dec 17, 2025
a79c49d
Add discrete Fourier Coefficients synthetic tests
kujaku11 Dec 18, 2025
f739df7
Enhance synthetic FC tests and add error handling
kujaku11 Dec 20, 2025
64e4894
Refactor and parametrize Fourier Coefficients tests
kujaku11 Dec 20, 2025
bd83c89
Add pytest suite for MATLAB Z-file reader
kujaku11 Dec 21, 2025
d79a21e
Update tests.yaml
kujaku11 Dec 21, 2025
ab00081
Enable parallel test execution and add test dependencies
kujaku11 Dec 21, 2025
ec5bca5
Update transfer_function_kernel.py
kujaku11 Jan 2, 2026
319721c
Update transfer_function_kernel.py
kujaku11 Jan 9, 2026
8ec2684
add ipykernel to dev dependencies
kkappler Jan 9, 2026
899ee02
labelled input arguments
kkappler Jan 9, 2026
8ceb633
update parkfield paths
kkappler Jan 9, 2026
1b9f35b
mark skip TestParkSingleStation.test_singl_station_comparison_with_emtf
kujaku11 Jan 9, 2026
1238fc2
Merge branch 'pydantic' of https://github.com/simpeg/aurora into pyda…
kujaku11 Jan 9, 2026
321f488
fix chained assignment warnings
kkappler Jan 9, 2026
f66ddf0
fix FutureWarnings
kkappler Jan 9, 2026
937b659
fix future warning again
kkappler Jan 10, 2026
d610149
revert parkfield paths to .cache
kkappler Jan 10, 2026
dc36fbc
make plot before asserting numerical comparison
kkappler Jan 10, 2026
16f0a01
Update DataFrame access and notebook outputs
kujaku11 Jan 10, 2026
a47a925
Update process_cas04_multiple_station.ipynb
kujaku11 Jan 10, 2026
9d4c975
Add CAS04 processing tests and analysis docs
kujaku11 Jan 11, 2026
988c12b
Add mth5_test_data to test workflow dependencies
kujaku11 Jan 11, 2026
4a630a9
Add slow test marker and CAS04 test suite README
kujaku11 Jan 11, 2026
ceb6cab
Refactor CAS04 test fixtures for versioned parallelization
kujaku11 Jan 11, 2026
555d053
Update process_cas04_multiple_station.ipynb
kujaku11 Jan 11, 2026
e5ec1bd
Refactor transfer function comparison and remove legacy plotting
kujaku11 Jan 11, 2026
0d58179
Refactor TF comparison tests and move helpers
kujaku11 Jan 11, 2026
4a17257
Enhance TF comparison metrics and add detailed tests
kujaku11 Jan 12, 2026
f8503f7
Refactor and enable single station comparison test
kujaku11 Jan 12, 2026
2a15cbd
Update test_parkfield_pytest.py
kujaku11 Jan 12, 2026
6c830f1
Update test_parkfield_pytest.py
kujaku11 Jan 12, 2026
8ed6c20
Enhance RR comparison test with numerical assertions
kujaku11 Jan 12, 2026
1a5dcd2
Improve test configuration and fixture efficiency
kujaku11 Jan 12, 2026
191f7f4
Add slow test marker and timeout configuration
kujaku11 Jan 12, 2026
9a180b9
Refactor plotting colors and streamline TF plotting
kujaku11 Jan 12, 2026
a0b95c9
Update process_cas04_multiple_station.ipynb
kujaku11 Jan 12, 2026
c52df75
Cache MTH5 test files and improve test fixture persistence
kujaku11 Jan 14, 2026
5a55908
Remove unused 'path' argument in test fixtures
kujaku11 Jan 14, 2026
3b06c04
Refactor MTH5 file fixtures for parallel test safety
kujaku11 Jan 14, 2026
7f1cc21
Add file locking to test data creation for concurrency
kujaku11 Jan 14, 2026
fe3ded7
Add file lock to parkfield_h5_master fixture for concurrency
kujaku11 Jan 14, 2026
386feb1
Refactor test helpers and improve file locking in tests
kujaku11 Jan 14, 2026
6f520b7
Set measurement azimuth and update TF read options in tests
kujaku11 Jan 14, 2026
7f4dbc0
Ensure MTH5 files are closed before renaming in tests
kujaku11 Jan 14, 2026
f5a289d
Change pytest to use auto for parallel tests
kujaku11 Jan 14, 2026
9ef623b
Use versioned cache directories for FDSN MTH5 test files
kujaku11 Jan 14, 2026
24cbc4a
Revert "Use versioned cache directories for FDSN MTH5 test files"
kujaku11 Jan 14, 2026
d5c1806
Revert "Ensure MTH5 files are closed before renaming in tests"
kujaku11 Jan 14, 2026
c28a407
Update edf_weights.py
kujaku11 Jan 14, 2026
2cfd23f
Merge branch 'pydantic' of https://github.com/simpeg/aurora into pyda…
kujaku11 Jan 14, 2026
8c04097
adding kwargs to the call
kujaku11 Jan 14, 2026
d5f8d16
sometimes a decimation level fails
kujaku11 Jan 14, 2026
fc476a6
uncomment notebooks and test 3.10 only
kkappler Jan 16, 2026
6ed7de4
add copy of edi from IRIS SPUD
kkappler Jan 18, 2026
60a076f
run notebook with local version of SPUD TF comparison
kkappler Jan 18, 2026
94aa747
changing tests yaml to pip install of mt-metadata and mth5
kujaku11 Jan 20, 2026
5b49b2f
Disable Jupyter Notebooks execution in tests.yaml
kujaku11 Jan 20, 2026
f979c0d
Implement Jupyter Notebook execution with error handling
kujaku11 Jan 20, 2026
a0659c4
Run just the notebooks catching failures
kujaku11 Jan 20, 2026
2d8545a
Add virtual environment activation to test workflow
kujaku11 Jan 20, 2026
dbffa42
Refactor testing workflow in GitHub Actions
kujaku11 Jan 20, 2026
189881c
Refactor paths for executed notebooks in tests.yaml
kujaku11 Jan 20, 2026
7301800
Update tests.yaml
kujaku11 Jan 20, 2026
d7911c3
Update pip install command to include nbformat
kujaku11 Jan 20, 2026
8b2a339
Activate virtual environment in tests workflow
kujaku11 Jan 20, 2026
1fdefdb
Change Jupyter notebook kernel to python3
kujaku11 Jan 20, 2026
4b7f18c
Fix formatting issues in tests.yaml file paths
kujaku11 Jan 20, 2026
9bf016a
adding simple logging messages
kujaku11 Jan 21, 2026
0ae015a
Merge branch 'pydantic' of https://github.com/simpeg/aurora into pyda…
kujaku11 Jan 21, 2026
1300f61
Update operate_aurora.ipynb
kujaku11 Jan 21, 2026
6c9f01e
Update operate_aurora.ipynb
kujaku11 Jan 21, 2026
cbc89e9
Add step to run tests in GitHub Actions workflow
kujaku11 Jan 21, 2026
2cfe23b
Update mth5 dependency version requirement
kujaku11 Jan 22, 2026
3099246
remove profile results
kkappler Jan 23, 2026
2d00e92
add 3.11 and 3.12 tests back in
kkappler Jan 23, 2026
8b46018
remove profile
kkappler Jan 23, 2026
7449479
remove duplicate compare function
kkappler Jan 23, 2026
c0b26c1
remove duplicate code
kkappler Jan 23, 2026
5e0ada5
install from patches on mth5, mt_metadata
kkappler Jan 24, 2026
2f4fa30
restrict numba version
kkappler Jan 24, 2026
3007798
undo overzealous cleanup of io helper
kkappler Jan 24, 2026
ac0f38e
Merge branch 'patches' into pydantic
kkappler Jan 24, 2026
f4478af
Revert "Merge branch 'patches' into pydantic"
kkappler Jan 25, 2026
154 changes: 154 additions & 0 deletions .github/workflows/tests.yaml
Collaborator Author

@jcapriot I'd be curious about your thoughts on changing tests.yaml to use uv instead of conda. I've been moving toward using uv for all my python projects lately. I don't foresee aurora incorporating C or other compiled code anytime in the near future. Any SimPEG-related drawbacks to this change that I may be missing?

@@ -0,0 +1,154 @@

name: Testing

on:
  push:
    branches:
      - '*'
  pull_request:
    branches:
      - '*'

jobs:
  setup-build:
    name: Ex1 (${{ matrix.python-version }}, ${{ matrix.os }})
    runs-on: ${{ matrix.os }}
    defaults:
      run:
        shell: bash
    strategy:
      fail-fast: false
      matrix:
        os: ["ubuntu-latest"]
        python-version: ["3.10", "3.11", "3.12"]

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "latest"

      - name: Set up Python ${{ matrix.python-version }}
        run: uv python install ${{ matrix.python-version }}

      - name: Cache MTH5 test files
        uses: actions/cache@v4
        with:
          path: ~/.cache/aurora
          key: mth5-test-files-${{ runner.os }}-${{ hashFiles('tests/conftest.py') }}
          restore-keys: |
            mth5-test-files-${{ runner.os }}-

      - name: Create virtual environment and install dependencies
        run: |
          uv venv --python ${{ matrix.python-version }}
          source .venv/bin/activate
          uv pip install --upgrade pip
          uv pip install -e ".[dev,test]"
          # uv pip install mt_metadata[obspy]
          uv pip install "mt_metadata[obspy] @ git+https://github.com/kujaku11/mt_metadata.git@patches"
          uv pip install git+https://github.com/kujaku11/mth5.git@patches
          # uv pip install mth5
          uv pip install git+https://github.com/kujaku11/mth5_test_data.git
          # Explicitly include nbconvert & ipykernel
          uv pip install jupyter nbconvert nbformat ipykernel pytest pytest-cov pytest-timeout codecov
          python -m ipykernel install --user --name "python3"

      - name: Install system dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y pandoc

      - name: Set kernel and execute Jupyter Notebooks
        run: |
          source .venv/bin/activate
          python << 'EOF'
          import nbformat
          import subprocess
          import sys

          notebooks = [
              "docs/examples/dataset_definition.ipynb",
              "docs/examples/operate_aurora.ipynb",
              "docs/tutorials/pkd_units_check.ipynb",
              "docs/tutorials/pole_zero_fitting/lemi_pole_zero_fitting_example.ipynb",
              "docs/tutorials/processing_configuration.ipynb",
              "docs/tutorials/process_cas04_multiple_station.ipynb",
              "docs/tutorials/synthetic_data_processing.ipynb"
          ]

          failures = []

          for nb_path in notebooks:
              # Update kernel spec
              print(f"Updating kernel in {nb_path}")
              try:
                  with open(nb_path, "r", encoding="utf-8") as f:
                      nb = nbformat.read(f, as_version=4)

                  nb["metadata"]["kernelspec"]["name"] = "python3"
                  nb["metadata"]["kernelspec"]["display_name"] = "Python (python3)"

                  with open(nb_path, "w", encoding="utf-8") as f:
                      nbformat.write(nb, f)
                  print(f"✓ Updated kernel in {nb_path}")
              except Exception as e:
                  print(f"✗ Failed to update kernel in {nb_path}: {e}")
                  failures.append(nb_path)
                  continue

              # Execute notebook
              print(f"Executing {nb_path}")
              result = subprocess.run(
                  ["jupyter", "nbconvert", "--to", "notebook", "--execute", nb_path],
                  capture_output=True,
                  text=True
              )

              if result.returncode != 0:
                  print(f"✗ Failed to execute {nb_path}")
                  print(result.stderr)
                  failures.append(nb_path)
              else:
                  print(f"✓ Successfully executed {nb_path}")

          if failures:
              print("\n======= Summary =======")
              print(f"Failed notebooks: {failures}")
              sys.exit(1)
          else:
              print("\n✓ All notebooks executed successfully!")
          EOF

      - name: Run Tests
        run: |
          source .venv/bin/activate
          pytest -s -v --cov=./ --cov-report=xml --cov=aurora -n auto tests

      - name: "Upload coverage reports to Codecov"
        uses: codecov/codecov-action@v4
        with:
          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
          fail_ci_if_error: false
          flags: tests

      # Note: these conditions never match any python-version in the matrix (3.10-3.12); adjust if doc builds are desired.
      - name: Build Doc
        if: ${{ (github.ref == 'refs/heads/main') && (matrix.python-version == '3.8') }}
        run: |
          source .venv/bin/activate
          cd docs
          make html
          cd ..

      - name: GitHub Pages
        if: ${{ (github.ref == 'refs/heads/main') && (matrix.python-version == '3.8') }}
        uses: crazy-max/ghaction-github-pages@v2.5.0
        with:
          build_dir: docs/_build/html
          jekyll: false
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
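The heredoc above runs every notebook and collects failures rather than aborting at the first error (per the "do not exit executing notebooks if one fails" commit). A minimal, runnable sketch of that pattern, with toy subprocess commands standing in for the `jupyter nbconvert` calls:

```python
import subprocess
import sys


def run_all(commands):
    """Run each command to completion, collecting failures instead of
    stopping at the first non-zero exit code."""
    failures = []
    for cmd in commands:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            failures.append(cmd)
    return failures


# Toy commands standing in for notebook executions: one succeeds, one fails.
commands = [
    [sys.executable, "-c", "print('ok')"],
    [sys.executable, "-c", "raise SystemExit(1)"],
]
failed = run_all(commands)
print(f"{len(failed)} of {len(commands)} commands failed")  # 1 of 2 commands failed
```

Exiting with `sys.exit(1)` only when the collected list is non-empty is what keeps the CI step red while still reporting every broken notebook in one run.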
109 changes: 0 additions & 109 deletions .github/workflows/tests.yml

This file was deleted.

49 changes: 42 additions & 7 deletions .pre-commit-config.yaml
@@ -1,10 +1,45 @@
+# .pre-commit-config.yaml
 repos:
-  - repo: https://github.com/ambv/black
-    rev: 22.6.0
+  - repo: https://github.com/pre-commit/pre-commit-hooks
+    rev: v4.4.0
     hooks:
-      - id: black
-        language_version: python3.10
-  - repo: https://github.com/pycqa/flake8
-    rev: 3.9.2
+      - id: trailing-whitespace
+        types: [python]
+      - id: end-of-file-fixer
+        types: [python]
+      - id: check-yaml
+        exclude: '^(?!.*\.py$).*$'
+
+  - repo: https://github.com/pycqa/isort
+    rev: 5.12.0
     hooks:
-      - id: flake8
+      - id: isort
+        types: [python]
+        exclude: (__init__.py)$
+        files: \.py$
+        args: ["--profile", "black",
+               "--skip-glob","*/__init__.py",
+               "--force-alphabetical-sort-within-sections",
+               "--order-by-type",
+               "--lines-after-imports=2"]
+
+  - repo: https://github.com/psf/black
+    rev: 23.3.0
+    hooks:
+      - id: black
+        types: [python]
+        files: \.py$
+        language_version: python3
+
+  - repo: https://github.com/pycqa/autoflake
+    rev: v2.1.1
+    hooks:
+      - id: autoflake
+        types: [python]
+        files: \.py$
+        args: [
+          "--remove-all-unused-imports",
+          "--expand-star-imports",
+          "--ignore-init-module-imports",
+          "--in-place"
+        ]
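The autoflake hook added above targets unused imports. A toy illustration of the idea behind `--remove-all-unused-imports`, using only the stdlib `ast` module (this is not autoflake's actual algorithm, and `unused_imports` is a name invented here for the sketch):

```python
import ast


def unused_imports(source: str) -> list:
    """Return imported names never referenced in the module body.
    A simplified sketch of what autoflake's --remove-all-unused-imports
    looks for; the real tool handles many more cases."""
    imported, used = set(), set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                imported.add(alias.asname or alias.name.split(".")[0])
        elif isinstance(node, ast.ImportFrom):
            for alias in node.names:
                imported.add(alias.asname or alias.name)
        elif isinstance(node, ast.Name):
            used.add(node.id)
    return sorted(imported - used)


print(unused_imports("import os\nimport sys\nprint(sys.path)"))  # ['os']
```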
2 changes: 1 addition & 1 deletion aurora/__init__.py
@@ -13,7 +13,7 @@
         "sink": sys.stdout,
         "level": "INFO",
         "colorize": True,
-        "format": "<level>{time} | {level: <3} | {name} | {function} | {message}</level>",
+        "format": "<level>{time} | {level: <3} | {name} | {function} | line: {line} | {message}</level>",
     },
 ],
 "extra": {"user": "someone"},
21 changes: 20 additions & 1 deletion aurora/config/config_creator.py
Expand Up @@ -16,7 +16,7 @@
from aurora.config.metadata import Processing
from aurora.sandbox.io_helpers.emtf_band_setup import EMTFBandSetupFile
from mth5.processing.kernel_dataset import KernelDataset
-from mt_metadata.transfer_functions.processing.window import Window
+from mt_metadata.processing.window import Window

import pathlib

@@ -127,11 +127,13 @@ def create_from_kernel_dataset(
         kernel_dataset: KernelDataset,
         input_channels: Optional[list] = None,
         output_channels: Optional[list] = None,
+        remote_channels: Optional[list] = None,
         estimator: Optional[str] = None,
         emtf_band_file: Optional[Union[str, pathlib.Path]] = None,
         band_edges: Optional[dict] = None,
         decimation_factors: Optional[list] = None,
         num_samples_window: Optional[int] = None,
+        **kwargs,
     ) -> Processing:
"""
This creates a processing config from a kernel dataset.
@@ -166,6 +168,8 @@
             List of the input channels that will be used in TF estimation (usually "hx", "hy")
         output_channels: list
             List of the output channels that will be estimated by TF (usually "ex", "ey", "hz")
+        remote_channels: list
+            List of the remote reference channels (usually "hx", "hy" at remote site)
         estimator: Optional[Union[str, None]]
             The name of the regression estimator to use for TF estimation.
         emtf_band_file: Optional[Union[str, pathlib.Path, None]]
@@ -176,6 +180,12 @@
             List of decimation factors, normally [1, 4, 4, 4, ... 4]
         num_samples_window: Optional[Union[int, None]]
             The size of the window (usually for FFT)
+        **kwargs:
+            Additional keyword arguments passed to Processing constructor. Could contain:
+            - save_fcs: bool
+                - If True, save Fourier coefficients during processing.
+            - save_fcs_type: str
+                - File type for saving Fourier coefficients. Options are "h5" or "csv".
 
     Returns
     -------
@@ -241,8 +251,17 @@
         else:
             decimation_obj.output_channels = output_channels
 
+        if remote_channels is None:
+            if kernel_dataset.remote_channels is not None:
+                decimation_obj.reference_channels = kernel_dataset.remote_channels
+
         if num_samples_window is not None:
             decimation_obj.stft.window.num_samples = num_samples_window[key]
 
+        if kwargs.get("save_fcs", False):
+            decimation_obj.save_fcs = True
+            decimation_obj.save_fcs_type = kwargs.get("save_fcs_type", "h5")
+
+        # set estimator if provided as kwarg
         if estimator:
             try:
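The save_fcs handling added to create_from_kernel_dataset only mutates the decimation object when the caller opts in via keyword arguments. A self-contained sketch of that opt-in pattern, using a hypothetical stand-in class (`DecimationLevel` and `apply_fc_kwargs` are invented here for illustration, not aurora's real API):

```python
class DecimationLevel:
    """Hypothetical stand-in for aurora's per-level processing config."""

    def __init__(self):
        self.save_fcs = False
        self.save_fcs_type = None


def apply_fc_kwargs(decimation_obj, **kwargs):
    # Same opt-in pattern as in create_from_kernel_dataset: defaults are
    # left untouched unless save_fcs is explicitly passed and truthy.
    if kwargs.get("save_fcs", False):
        decimation_obj.save_fcs = True
        decimation_obj.save_fcs_type = kwargs.get("save_fcs_type", "h5")
    return decimation_obj


untouched = apply_fc_kwargs(DecimationLevel())
enabled = apply_fc_kwargs(DecimationLevel(), save_fcs=True)
print(untouched.save_fcs, enabled.save_fcs, enabled.save_fcs_type)  # False True h5
```

Because `kwargs.get("save_fcs", False)` defaults to False, existing callers that never pass the new keywords see no behavior change, which is the safe way to widen a long-lived factory signature.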