
added Main for running with different #1

Open

Artemis2467 wants to merge 4 commits into main from new-features

Conversation

@Artemis2467
Owner

hyperparameters


@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: b49f7a9473

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review"

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

from Functions import LinearConfig, plot_loss, run_batch, test_model, graph_roc_curve

def linear_train(config, model_pth):
    config = LinearConfig()


P1: Preserve the caller's LinearConfig in linear_train

The new sweep in code/Main.py:59-65 mutates config.d_model and config.learning_rate before each call, but linear_train() immediately replaces that object with a fresh LinearConfig. Every non-default run therefore trains the default 32-hidden-unit / 0.1-LR model while the console output and linear_history.txt record it as a different hyperparameter combination, which makes the linear search results unusable.

Useful? React with 👍 / 👎.
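A minimal sketch of one way to apply this suggestion, assuming linear_train should fall back to defaults only when no config is supplied (everything after the guard is a placeholder, since the rest of the function body is not shown in the diff):

    from Functions import LinearConfig

    def linear_train(config, model_pth):
        # Keep the caller's LinearConfig so the sweep's d_model and
        # learning_rate values actually reach training; construct the
        # default config only when the caller passed nothing.
        if config is None:
            config = LinearConfig()
        # ... existing training loop, logging to linear_history.txt ...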


def logprob_train(config, model_pth):
    config = LogitConfig()


P1: Preserve the caller's LogitConfig in logprob_train

code/Main.py:74-87 iterates over add_conv, d_model, conv_ch, and learning_rate, but logprob_train() discards those values by constructing a new default LogitConfig here. As written, the 60-run search will keep retraining the same no-CNN, 32-dimension, 0.1-LR model instead of the requested variants, so the comparison this commit adds is invalid for every non-default logprob run.

Useful? React with 👍 / 👎.
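The same guard applies here. Below is a sketch under the same assumption, plus a paraphrase of how the sweep in code/Main.py could build one fresh config per run instead of mutating a shared object; the grid values and model path are placeholders, not the actual values from the commit, and the import location of LogitConfig is assumed:

    from Functions import LogitConfig  # assumed to live alongside LinearConfig

    def logprob_train(config, model_pth):
        # Respect the sweep's add_conv / d_model / conv_ch / learning_rate
        # settings; default only when nothing is passed.
        if config is None:
            config = LogitConfig()
        # ... existing training loop ...

    # Sweep sketch: one fresh config per run, so the recorded
    # hyperparameters always match the model that was trained.
    for add_conv in (False, True):
        for d_model in (32, 64):
            for lr in (0.1, 0.01):
                cfg = LogitConfig()
                cfg.add_conv = add_conv
                cfg.d_model = d_model
                cfg.learning_rate = lr
                logprob_train(cfg, model_pth="logprob_model.pt")  # path is a placeholder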
