Uniform initialization interval #244
Comments
This is definitely a bug (whether in our configs or in the code is a matter of interpretation). The idea was to have symmetric intervals, i.e., to set a = -b, where a and b are the lower and upper bound, respectively. The current code only sets the lower bound when it isn't given, meaning our search configs, which set the lower bound, are using the default value of the upper bound, b = 1.0. The desired behavior can be obtained by setting "b" instead of "a" in the configs.
Codewise, I suggest a general solution that keeps the idea of symmetric intervals: set any non-given bound to -1 times the given bound (see the sketch below). We should use clearer names too, so lower_bound and upper_bound instead of torch's "a" and "b". Finally, we could decide to check that the given lower bound is indeed lower than the upper one, but torch lets that fly, so I suggest we do the same. @rgemulla thoughts?
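A minimal sketch of that suggestion, assuming the proposed key names and a dict-based `init_args` (both are illustrative, not the actual LibKGE interface):

```python
def complete_bounds(init_args: dict) -> dict:
    """Set any non-given bound to -1 times the given one (symmetric interval).

    Sketch of the proposal above; the key names and dict-based interface
    are assumptions for illustration, not the actual LibKGE code.
    """
    if "lower_bound" in init_args and "upper_bound" not in init_args:
        init_args["upper_bound"] = -init_args["lower_bound"]
    elif "upper_bound" in init_args and "lower_bound" not in init_args:
        init_args["lower_bound"] = -init_args["upper_bound"]
    return init_args

# e.g. complete_bounds({"lower_bound": -0.85})
#      -> {"lower_bound": -0.85, "upper_bound": 0.85}
```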
I think low/high is better than a/b. It's perhaps cleanest to disable any automatic setting of non-given values (and use defaults 0/1).
I now think the cleanest thing is to just get rid of the automatic setting of non-given values and leave the rest as is. Sure, using lower_bound/upper_bound would be clearer than torch's a and b, but it's nicer to keep a and b so that the use of all initialization methods stays consistent with torch's documentation. @rgemulla, if you agree, I'll drop the automatic setting. Otherwise, I'll add a line that rewrites the lower_bound/upper_bound keys as a and b before passing them to torch.
Yes, agreed. If only one is set, issue a warning with the actual bounds being used. If bounds are inconsistent, raise an error.
Inconsistent how? PyTorch actually does not allow the lower bound to be higher than the upper bound (I had tested it wrong before), so we may just want PyTorch to handle that.
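A quick check of that behavior (exception type and message may vary across torch versions):

```python
import torch

t = torch.empty(3)
torch.nn.init.uniform_(t, a=-0.5, b=0.5)  # fine: samples uniformly from [-0.5, 0.5]
try:
    torch.nn.init.uniform_(t, a=0.5, b=-0.5)  # lower bound above upper bound
except RuntimeError as err:
    print(err)  # torch rejects the inverted interval itself
```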
Fine, as long as it's signaled via an error.
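Putting the agreement together, a sketch of the resulting handling (function name and structure assumed for illustration; not the actual LibKGE code): no automatic completion of a missing bound, a warning when only one bound is given (torch's defaults a=0.0, b=1.0 fill the other), and inconsistent bounds left for torch to reject with an error:

```python
import warnings
import torch

def init_uniform(tensor: torch.Tensor, init_args: dict) -> torch.Tensor:
    """Initialize `tensor` uniformly; sketch of the agreed behavior."""
    a = init_args.get("a", 0.0)  # torch's documented defaults
    b = init_args.get("b", 1.0)
    if ("a" in init_args) != ("b" in init_args):
        # exactly one bound was given: warn with the bounds actually used
        warnings.warn(f"Only one bound of the uniform interval was given; using [a={a}, b={b}].")
    # if a > b, torch raises an error itself, so no extra check here
    return torch.nn.init.uniform_(tensor, a=a, b=b)
```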
Hello,
First of all, thanks for your work, very helpful framework!
I've got a quick question about hyperparameter search and the uniform initialization interval. In the configs used for the paper you set only the lower bound a of the uniform interval, and the table describing the hyperparameter search space in the paper indicates a symmetric interval, so I would assume that b = -1 * a. However, in the codebase it is the opposite (a = -1 * b). Is this a bug or did I overlook something?
kge/kge/model/kge_model.py, lines 76 to 80 in d063477
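For context, the referenced lines amount to something like the following (paraphrased from the discussion above, not the verbatim code):

```python
# Paraphrased sketch of kge/kge/model/kge_model.py, lines 76 to 80 (d063477):
# when "a" is not given, the lower bound is mirrored from "b",
# i.e. a = -1 * b -- the opposite direction of the expected b = -1 * a.
init_args = {"b": 1.0}
if "a" not in init_args:
    init_args["a"] = -init_args["b"]
print(init_args)  # {'b': 1.0, 'a': -1.0}
```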