Operations when merging EBM #532
Comments
Hi @JWKKWJ123 -- I'm not aware of a paper that describes the merge function. The effect of merge_ebms() is the same as traditional model ensembling, which you could do with any model. In the case of EBMs, though, since the predictions come from additive functions, instead of averaging the predictions after they have been made, you can use the associativity of addition to move the averaging back into the partial response functions themselves. There is more complexity in practice because the bins of the EBMs being merged don't necessarily line up. We handle that by making new bins that are a superset of all the bins across the EBMs being merged.
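The associativity argument above can be sketched in a few lines of plain Python. This is a toy illustration, not the library's implementation: the two "EBMs" are just lists of per-bin scores plus an intercept (all values hypothetical), and the point is that averaging predictions gives the same answer as averaging the score functions first.

```python
# Toy sketch: with additive models, averaging predictions equals averaging the
# per-feature score functions first (associativity/commutativity of addition).

def predict_additive(shape_scores, intercept, x):
    """Prediction of an additive model: intercept + sum of per-feature bin scores."""
    return intercept + sum(scores[xi] for scores, xi in zip(shape_scores, x))

# Two hypothetical "EBMs" over 2 features, each feature discretized into 3 bins.
ebm1_scores = [[0.1, -0.2, 0.3], [0.5, 0.0, -0.1]]
ebm2_scores = [[0.2, 0.0, -0.3], [0.1, 0.4, 0.2]]
ebm1_intercept, ebm2_intercept = 1.0, 2.0

x = (1, 2)  # bin index of the sample in each feature

# Ensemble the usual way: average the two predictions...
avg_pred = 0.5 * (predict_additive(ebm1_scores, ebm1_intercept, x)
                  + predict_additive(ebm2_scores, ebm2_intercept, x))

# ...or merge first: average intercepts and per-bin scores, then predict once.
merged_scores = [[0.5 * (a + b) for a, b in zip(s1, s2)]
                 for s1, s2 in zip(ebm1_scores, ebm2_scores)]
merged_intercept = 0.5 * (ebm1_intercept + ebm2_intercept)
merged_pred = predict_additive(merged_scores, merged_intercept, x)

assert abs(avg_pred - merged_pred) < 1e-12  # identical up to float rounding
```

The sketch assumes both models share the same bins for each feature; the rest of the thread covers what happens when they don't.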
Dear Paul,
Hi @JWKKWJ123 -- The range of x should be from -inf to +inf. Using your example mostly, let's say I have: EBM1, bin_range:score; EBM2, bin_range:score. The new bins and scores will be: and if EBM1 and EBM2 have intercepts, then take the average of those too.
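The bin-superset step described above can be sketched for a single continuous feature. This is an illustrative reconstruction with made-up cut points and scores, not the library's code: take the union of the two models' cut points, look up each model's score for every new bin, average the scores per bin, and average the intercepts.

```python
# Hypothetical sketch of merging one continuous feature from two EBMs whose
# cut points differ. Bin edges run from -inf to +inf: bins are
# (-inf, cuts[0]], (cuts[0], cuts[1]], ..., (cuts[-1], +inf).
import bisect

def score_for(cuts, scores, x):
    """Score of the bin that contains x."""
    return scores[bisect.bisect_right(cuts, x)]

cuts1, scores1 = [2.0, 5.0], [0.1, 0.3, -0.2]  # EBM1: 3 bins (illustrative)
cuts2, scores2 = [3.0], [0.0, 0.4]             # EBM2: 2 bins (illustrative)

# New cuts are the superset (union) of both models' cuts.
merged_cuts = sorted(set(cuts1) | set(cuts2))  # [2.0, 3.0, 5.0] -> 4 bins

# Pick a representative point inside each merged bin to look up the old scores.
reps = ([merged_cuts[0] - 1.0]
        + [(a + b) / 2 for a, b in zip(merged_cuts, merged_cuts[1:])]
        + [merged_cuts[-1] + 1.0])

# Each merged bin's score is the average of the two models' scores there.
merged_scores = [0.5 * (score_for(cuts1, scores1, r) + score_for(cuts2, scores2, r))
                 for r in reps]

# Intercepts are averaged as well (hypothetical values).
intercept1, intercept2 = 0.5, 1.5
merged_intercept = 0.5 * (intercept1 + intercept2)
```

Each merged bin lies entirely inside exactly one bin of each original model, which is why a single representative point per merged bin suffices for the lookup.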
Hi Paul,
max_bins only applies when fitting EBMs; if you merge EBMs afterwards, there is no upper limit on the number of bins. Note also that max_bins applies only to continuous features, BTW.
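The point that merged models can exceed max_bins follows directly from the superset construction: when the cut points of the merged models don't coincide, the union has more cuts than either model alone. A minimal illustration with made-up cut points:

```python
# Illustrative only: two models each fit with (hypothetically) max_bins=4,
# i.e. 3 cuts, but at different positions. Their merge has more bins than
# either model was allowed during fitting.
cuts_a = [1.0, 2.0, 3.0]  # 4 bins
cuts_b = [1.5, 2.5, 3.5]  # 4 bins

merged_cuts = sorted(set(cuts_a) | set(cuts_b))  # 6 cuts
merged_bin_count = len(merged_cuts) + 1          # 7 bins, exceeding max_bins=4
```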
Hi Paul,
Hi all,
Now I am using ebm.merge(). However, I haven't found a formula or pseudocode describing how EBMs are merged. If I want to explain the merging of EBMs to others, is there a paper, formula, or pseudocode that I can refer to?