added setting to skip x epochs when using train_and_test_on_datasets #258
Conversation
Hello, thanks for the PR.
We can merge it once the comment is addressed and the formatting is done.
To do so, you can run make format.
baal/modelwrapper.py
Outdated
if patience is not None and (e - best_epoch) > patience and (e > min_epoch_for_es):
    # Early stopping
    break
hist.append(self.get_metrics())
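For context, the patience check in the diff above can be read as a small standalone predicate. This is only an illustrative sketch (variable names follow the diff; the actual baal code lives inside the training loop):

```python
def should_stop(e, best_epoch, patience, min_epoch_for_es):
    """Early-stop when no improvement has been seen for more than
    `patience` epochs, but never before `min_epoch_for_es` epochs have run.

    A `patience` of None disables early stopping entirely.
    """
    return (
        patience is not None
        and (e - best_epoch) > patience
        and e > min_epoch_for_es
    )
```

With patience=3, training at epoch 10 whose best epoch was 5 would stop, while a best epoch of 8 would not.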
I think this appends the metric twice; L184 already does it.
Yes, I made a mistake in the PR, sorry about that.
Hi @studentWUR,
Thank you so much for your Pull Request. Would you mind pushing the requested change so that we can merge this branch?
Sorry for my late reply; pushing to a pull request is new for me, so I am slightly confused. In my forked repo I fixed the above-mentioned mistake, but how do I push this to your repository? Can you merge it from my repo?
LGTM!
Summary:
During training, it is not always desirable to test at every epoch. This simple update solves that by adding a skip_epochs option to train_and_test_on_datasets.
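The intended behaviour can be sketched as follows. This is a hypothetical standalone helper, not the actual baal implementation; only the skip_epochs name comes from the PR:

```python
def epochs_to_test(epochs, skip_epochs=1):
    """Return the epoch indices on which testing would run.

    With skip_epochs=1 (the default), every epoch is tested;
    with skip_epochs=3, only every third epoch is.
    """
    return [e for e in range(epochs) if e % skip_epochs == 0]
```

For example, over 6 epochs with skip_epochs=3, testing would run only on epochs 0 and 3.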
Checklist:
tests/documentation_test.py