About training with 8 gpus #67

Closed
opened 2026-01-29 21:40:41 +00:00 by claunia · 2 comments
Owner

Originally created by @NNNNAI on GitHub (Sep 15, 2021).

Hi xintao, thanks for sharing your great work.
I am currently trying to train GFPGAN with 8 GPUs, which means the total batch size will be doubled. Should I modify some hyperparameters in train_gfpgan_v1.yml, such as the learning rate and the total number of steps? Thanks again, have a nice day~.

Author
Owner

@xinntao commented on GitHub (Sep 17, 2021):

I have also trained with 8 GPUs without modifying the hyper-parameters. The results are OK with fewer iterations, but they may not be optimal~
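For readers who do want to adjust the config, a common heuristic (not from this thread, and not necessarily what GFPGAN's authors used) is the linear scaling rule: scale the learning rate by the same factor as the total batch size, and reduce the iteration count by that factor so the model sees roughly the same number of samples. The base values below are hypothetical placeholders, not the actual defaults in train_gfpgan_v1.yml:

```python
# Sketch of the linear-scaling heuristic, assuming hypothetical base values.
# When the total batch size grows by some factor, scale the learning rate
# up and the iteration count down by that same factor.
def scale_hyperparams(base_lr, base_iters, base_batch, new_batch):
    factor = new_batch / base_batch
    return base_lr * factor, int(base_iters / factor)

# Example: doubling the total batch size (e.g. 4 GPUs -> 8 GPUs).
lr, iters = scale_hyperparams(base_lr=2e-3, base_iters=800_000,
                              base_batch=12, new_batch=24)
print(lr, iters)  # 0.004 400000
```

This is only a starting point; as the reply above notes, the unmodified hyperparameters can also produce acceptable results.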
Author
Owner

@NNNNAI commented on GitHub (Sep 17, 2021):

Thanks for your help! Have a great day ~


Reference: TencentARC/GFPGAN#67