Conversation

vijk777 (Collaborator) commented Jan 23, 2026

Extract shared training utilities and configuration classes into separate modules, in preparation for the latent trajectory model implementation.

created new modules:

  • training_utils.py: LossAccumulator (enum-based), seed_everything, get_device
  • training_config.py: DataSplit, ProfileConfig, TrainingConfig, CrossValidationConfig
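As a rough sketch of what training_utils.py might contain (the enum members, field names, and exact signatures below are guesses, not the actual implementation):

```python
import os
import random
from collections import defaultdict
from enum import Enum


class LossKey(Enum):
    """Hypothetical loss components; the real enum lives in training_utils.py."""
    TOTAL = "total"
    RECON = "recon"


class LossAccumulator:
    """Enum-keyed running mean of per-batch loss values."""

    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def update(self, key, value, n=1):
        # Accumulate a batch-mean loss weighted by batch size n.
        self._sums[key] += float(value) * n
        self._counts[key] += n

    def mean(self, key):
        return self._sums[key] / self._counts[key]


def seed_everything(seed):
    """Seed the stdlib RNG; the real version presumably also seeds numpy/torch."""
    random.seed(seed)
    os.environ["PYTHONHASHSEED"] = str(seed)


def get_device():
    """Guess at the helper: 'cuda' if torch sees a GPU, else 'cpu'."""
    try:
        import torch
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        return "cpu"
```

Because the accumulator is keyed by an enum rather than model-specific strings, any model can define its own `LossKey`-style enum and reuse the same accumulator, which is what makes it generic.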

moved code between modules:

  • ModelParams: latent.py → eed_model.py (model-specific config)
  • load_column_slice, load_metadata: zarr_io.py → load_flyvis.py (flyvis-specific)
  • DataSplit: load_flyvis.py → training_config.py (training concept)
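A minimal sketch of the training_config.py classes, assuming they are plain dataclasses plus an enum for the split (all field names and defaults below are illustrative, not the real ones):

```python
from dataclasses import dataclass, field
from enum import Enum


class DataSplit(Enum):
    """Which partition of the data a loader should return."""
    TRAIN = "train"
    VAL = "val"
    TEST = "test"


@dataclass
class TrainingConfig:
    # Hypothetical fields; the actual ones live in training_config.py.
    batch_size: int = 32
    learning_rate: float = 1e-3
    num_epochs: int = 10
    seed: int = 0


@dataclass
class CrossValidationConfig:
    num_folds: int = 5
    training: TrainingConfig = field(default_factory=TrainingConfig)
```

Moving `DataSplit` out of load_flyvis.py matches the layering goal: it is a training concept, so the data-loading module can import it from training_config.py rather than the other way around.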

updated imports:

  • chunk_streaming.py, post_run_analyze.py, benchmark_training.py: zarr_io → load_flyvis
  • latent.py: added load_dataset and load_val_only functions, updated to use new modules
  • zarr_io.py: removed flyvis-specific dependencies
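The new latent.py loaders might look roughly like this, with `load_val_only` as a thin convenience wrapper over `load_dataset` (both bodies are stand-ins; the real functions presumably read zarr arrays via load_flyvis):

```python
def load_dataset(root, splits=("train", "val")):
    """Stand-in loader: returns a split-name -> path mapping.

    The real implementation presumably loads arrays via load_flyvis."""
    return {s: f"{root}/{s}" for s in splits}


def load_val_only(root):
    """Convenience wrapper that loads only the validation split."""
    return load_dataset(root, splits=("val",))["val"]
```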

benefits:

  • no circular dependencies
  • proper module layering (general modules don't depend on specific ones)
  • shared code extracted for reuse across models
  • LossAccumulator is now generic and reusable

🤖 Generated with Claude Code

Co-Authored-By: Claude <noreply@anthropic.com>
vijk777 merged commit 3546457 into main Jan 23, 2026
2 checks passed
vijk777 deleted the vj/refactor1 branch January 23, 2026 01:12
vijk777 (Collaborator, Author) commented Jan 23, 2026

Results are identical up to epoch 5.
