What is the usage of SECOND_ORDER_GRAD_ITER=0 and self.total_loss1? #11
Comments
MAML requires 2nd-order gradients, which are computationally expensive. SECOND_ORDER_GRAD_ITER decides for how many steps the gradients are approximated with a 1st-order computation. For self.total_loss1, you are right: it is not used for training. You may ignore that loss and its corresponding optimizer.
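The difference between the exact 2nd-order meta-gradient and the 1st-order approximation can be illustrated on a one-dimensional toy problem. This is a sketch, not the repo's code; `alpha`, `a`, and `b` are made-up values:

```python
# Toy 1-D illustration of second-order vs. first-order MAML gradients.
alpha = 0.1       # inner-loop learning rate (hypothetical value)
w = 0.0           # meta-parameter
a, b = 1.0, 2.0   # toy support/query targets

# Inner update on the support loss L_s(w) = (w - a)^2
grad_inner = 2 * (w - a)
w_adapted = w - alpha * grad_inner

# Gradient of the query loss L_q(w') = (w' - b)^2 at the adapted weights
grad_query = 2 * (w_adapted - b)

# Exact (second-order) meta-gradient: chain rule through the inner update.
# For this quadratic loss, dw'/dw = 1 - 2 * alpha.
meta_grad_exact = grad_query * (1 - 2 * alpha)

# First-order approximation: drop the Jacobian of the inner update,
# i.e. treat dw'/dw as the identity. This is what running with
# SECOND_ORDER_GRAD_ITER = 0 presumably does for every step.
meta_grad_fo = grad_query
```

The two meta-gradients differ exactly by the factor `dw'/dw`, which is what computing 2nd-order gradients buys you at the cost of backpropagating through the inner-loop updates.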
Dear Sir,
Could you kindly explain how the loss weights are calculated in def get_loss_weights(self)?
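On the loss-weight question: one common scheme for weighting the per-step inner-loop losses is the MAML++-style multi-step loss annealing, which starts uniform and gradually shifts weight onto the final step. Whether get_loss_weights uses exactly this formula is an assumption; the sketch below only illustrates the idea:

```python
import numpy as np

def get_loss_weights(num_steps, cur_iter, total_anneal_iters):
    """Hypothetical MAML++-style multi-step loss weights (an assumption,
    not necessarily this repo's exact scheme): start uniform over the
    inner-loop steps, then linearly move weight onto the last step."""
    min_w = 0.03 / num_steps                  # floor for early-step weights
    uniform = 1.0 / num_steps
    progress = min(cur_iter / total_anneal_iters, 1.0)
    # Early steps decay linearly from uniform down to the floor.
    weights = np.full(num_steps, max(uniform - progress * uniform, min_w))
    # The last step absorbs the rest so the weights always sum to 1.
    weights[-1] = 1.0 - weights[:-1].sum()
    return weights
```

With these weights, the total training loss is the weighted sum of the query losses after each inner-loop step; early in training every step contributes, while late in training the loss after the final step dominates.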
What is the usage of SECOND_ORDER_GRAD_ITER=0 and self.total_loss1?
As for SECOND_ORDER_GRAD_ITER=0: if we have finished pre-training on large-scale datasets, I think it is useless in this meta-transfer learning step.
As for self.total_loss1: in this meta-transfer learning step, total_loss1 is never used by the optimizers. Is that correct?