Possible to use Real-ESRGAN on macOS AMD, and 16-bit images? #103
Originally created by @KeygenOld on GitHub (Nov 11, 2021).
Faces come out excellent with GFPGAN, it's pretty amazing, but without CUDA support the backgrounds are left pretty nasty. I'm still amazed at how fast it runs on CPU (only a few seconds per image, even with a scale of 8). Granted, it's using the non-color version, but I have colorizing software and do the rest manually. I've been unsuccessful trying to enable color using the Paper model and forcing CPU.
More important than color, since I use other software for colorization, is the background. I need the backgrounds restored, not just the faces.
I've tried setting the --bg_upsampler realesrgan flag, and it does not throw an error, but it seems to have no effect on the output image. I do get the warning that Real-ESRGAN is slow and not used on CPU. Is it possible to enable Real-ESRGAN on macOS so that it uses the AMD GPU and restores the background (I have a desktop with a Pro Vega 64)? I saw the other Real-ESRGAN build compiled for Mac/AMD; maybe the two can be linked somehow?
If it can't use the AMD GPU, can it be forced to use the CPU? I don't care if it's slow, I just need it to work. :) I do a lot of rendering that is slow, because sometimes that's the only way. The main thing is getting it to work.
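For what it's worth, my guess (an assumption on my part, not confirmed) is that the flag is ignored because GFPGAN only builds the Real-ESRGAN background upsampler when PyTorch reports a CUDA device, which never happens on macOS with an AMD card:

```python
import torch

# On a Mac with an AMD GPU this prints False, so the realesrgan branch in
# inference_gfpgan.py is skipped and only the warning is printed.
print(torch.cuda.is_available())
```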
Also, is it possible to enable the use of 16-bit PNG, TIFF, or Cinema DNG? It would be really cool if it could also support 32-bit float TIFF or EXR.
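Just to illustrate what I mean by 16-bit support (a rough sketch, not something GFPGAN does today; the file names are placeholders): OpenCV can already load 16-bit PNG/TIFF with IMREAD_UNCHANGED, so the restorer would mainly need to normalize by 65535 instead of 255 and write the result back out as uint16.

```python
import cv2
import numpy as np

# Read a 16-bit PNG/TIFF without converting it down to 8 bits
img = cv2.imread('scan_16bit.png', cv2.IMREAD_UNCHANGED)   # dtype is uint16 for a 16-bit PNG
img_float = img.astype(np.float32) / 65535.0               # network input in [0, 1]

# ... restoration would run here on img_float ...
restored = img_float

# Write the result back out at the original bit depth
out = np.clip(restored * 65535.0, 0, 65535).round().astype(np.uint16)
cv2.imwrite('restored_16bit.png', out)
```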
Thank you
@KeygenOld commented on GitHub (Nov 12, 2021):
I managed to figure out how to force the CPU to be used if you don't have a CUDA card. There may be a better way, but I am getting results. It might take a minute, or a few minutes with larger images, but it's not so slow that it's unusable. Still rather fast, imo.
You need to modify the code in the inference_gfpgan.py file to make sure RealESRGANer is used instead of just throwing a warning. Change this code:
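(This is the background-upsampler block near the top of the script; a sketch from memory, so the exact warning text, model URL, and keyword arguments may differ in your copy.)

```python
if args.bg_upsampler == 'realesrgan':
    if not torch.cuda.is_available():  # CPU
        import warnings
        warnings.warn('The unoptimized RealESRGAN is very slow on CPU. We do not use it. '
                      'If you really want to use it, please modify the corresponding codes.')
        bg_upsampler = None
    else:
        from realesrgan import RealESRGANer
        bg_upsampler = RealESRGANer(
            scale=2,
            model_path='https://github.com/xinntao/Real-ESRGAN/releases/download/v0.2.1/RealESRGAN_x2plus.pth',
            tile=args.bg_tile,
            tile_pad=10,
            pre_pad=0,
            half=True)  # needs to be False in CPU mode
else:
    bg_upsampler = None
```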
To this code:
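(Again a sketch under the same assumptions: the essential change is dropping the torch.cuda.is_available() check so the upsampler is always built, and passing half=False, since half precision is not supported on CPU.)

```python
if args.bg_upsampler == 'realesrgan':
    from realesrgan import RealESRGANer
    bg_upsampler = RealESRGANer(
        scale=2,
        model_path='https://github.com/xinntao/Real-ESRGAN/releases/download/v0.2.1/RealESRGAN_x2plus.pth',
        tile=args.bg_tile,
        tile_pad=10,
        pre_pad=0,
        half=False)  # full precision so it runs on CPU
else:
    bg_upsampler = None
```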
When you run the script, it will now show tiles being rendered instead of just the file names. There will be more tiles with larger images.
If anyone can shed light on how to get this working with an AMD GPU on the Mac side, I would appreciate it, as well as how to use images with higher bit depth (16-bit or 32-bit per channel).