Train with GPU and inference without GPU. Is it possible ? #52

Open
opened 2026-01-29 21:40:00 +00:00 by claunia · 0 comments
Owner

Originally created by @MDYLL on GitHub (Aug 18, 2021).

Hello :)
One more thing: thank you very much for your beautiful project!

  1. I trained a model on my own dataset - mymodel.pth
  2. I ran inference on CPU with your model - GFPGANCleanv1-NoCE-C2.pth
  3. I see that GFPGANv1.pth (and mymodel.pth) is twice the size of GFPGANCleanv1-NoCE-C2.pth

So, how can I transform mymodel.pth to run inference on CPU? Or should I train another model?

Thank you :))


Reference: TencentARC/GFPGAN#52