
Mini-batch size - 8 or 16? #30

Open
marijavella opened this issue Mar 5, 2020 · 1 comment
@marijavella
Hi,

The paper states that mini-batches of 8 LR colour patches of size 48×48 are used for training, but the default settings use a mini-batch size of 16. Which settings should be used to reproduce the results in the paper?

When I reduced the batch size to 10 because of GPU memory limitations, the PSNR on Set5 ×2 reached about 37.8 dB and stopped improving after 590 epochs, which falls short of the results in the paper.
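One common workaround when the full batch does not fit in GPU memory is gradient accumulation: split the batch into micro-batches and accumulate gradients before each optimizer step, so the effective batch size still matches the paper. This is a generic PyTorch sketch, not code from this repository; the model, optimizer, and loss are placeholders standing in for the actual SR network and training loop.

```python
import torch
import torch.nn.functional as F

def train_step(model, opt, lr_batch, hr_batch, accum_steps):
    """One optimizer step over the full batch, split into accum_steps
    micro-batches so each forward/backward pass fits in GPU memory."""
    opt.zero_grad()
    micro = lr_batch.size(0) // accum_steps
    for i in range(accum_steps):
        lo, hi = i * micro, (i + 1) * micro
        loss = F.l1_loss(model(lr_batch[lo:hi]), hr_batch[lo:hi])
        # Divide by accum_steps so the summed gradients equal the
        # gradient of the mean loss over the full batch.
        (loss / accum_steps).backward()
    opt.step()

torch.manual_seed(0)
net = torch.nn.Linear(4, 1)                     # stand-in for the SR network
optim = torch.optim.Adam(net.parameters(), lr=1e-4)
x, y = torch.randn(16, 4), torch.randn(16, 1)   # stand-in for LR/HR patches
train_step(net, optim, x, y, accum_steps=2)     # effective batch 16, 8 samples per pass
```

With `accum_steps=2` and a per-pass micro-batch of 8, the parameter update is mathematically the same as one step on a batch of 16, at the cost of two forward/backward passes per step.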

I am aware that a few questions have already been asked about the batch size, but I couldn't find an answer. Does anyone have any information on this? Thanks.

@EchoXu98

Hi, how many GPUs did you use to train this model?
