GAN Training #56

Open
opened 2026-01-29 21:40:14 +00:00 by claunia · 0 comments

Originally created by @SimKarras on GitHub (Aug 25, 2021).

Hello, I have a question that has been puzzling me. For GAN training, referring to https://github.com/rosinality/stylegan2-pytorch/blob/master/train.py: when training the Discriminator, rosinality turns off gradient updates for the Generator:

        requires_grad(generator, False)
        requires_grad(discriminator, True)

Likewise, when training the Generator, gradient updates for the Discriminator are turned off:

        requires_grad(generator, True)
        requires_grad(discriminator, False)
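
For reference, the `requires_grad` used there is just a small helper that flips the flag on every parameter of a model; a minimal sketch of the equivalent function:

    def requires_grad(model, flag=True):
        # Enable or disable gradient tracking for all parameters of `model`.
        for p in model.parameters():
            p.requires_grad = flag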

In your code I only found this kind of gradient control applied to the Discriminator; I could not find any equivalent toggling of the Generator's gradients:

        for p in self.net_d.parameters():
            p.requires_grad = False

and

        for p in self.net_d.parameters():
            p.requires_grad = True

1. Does this mean the Generator always receives gradient updates (even while the Discriminator is being trained)? If so, is that equivalent to backpropagating through the Generator twice for every batch of data?
2. If the Generator's gradient updates are in fact also controlled somewhere, where in the code is that implemented?
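
For context, here is a minimal, self-contained sketch of the common alternating training pattern (placeholder 1x1-conv networks and a BCE-style GAN loss; `net_g`, `net_d`, `gan_loss`, `input_img`, and so on are illustrative assumptions, not GFPGAN's actual code). It shows why detaching the generator output during the discriminator step already blocks gradients from reaching the Generator, even when the Generator's `requires_grad` flags are left untouched:

    import torch
    import torch.nn as nn

    # Placeholder networks and loss: purely illustrative stand-ins.
    net_g = nn.Conv2d(3, 3, 1)   # "generator"
    net_d = nn.Conv2d(3, 1, 1)   # "discriminator"
    optimizer_g = torch.optim.Adam(net_g.parameters(), lr=1e-4)
    optimizer_d = torch.optim.Adam(net_d.parameters(), lr=1e-4)
    bce = nn.BCEWithLogitsLoss()

    def gan_loss(pred, is_real):
        # BCE against all-ones (real) or all-zeros (fake) targets.
        target = torch.ones_like(pred) if is_real else torch.zeros_like(pred)
        return bce(pred, target)

    input_img = torch.randn(1, 3, 8, 8)
    real_img = torch.randn(1, 3, 8, 8)

    # ---- Generator step: freeze D (the loops quoted above) so that
    # g_loss.backward() does not accumulate gradients into net_d; note
    # that optimizer_g.step() could only ever update net_g anyway.
    for p in net_d.parameters():
        p.requires_grad = False

    fake_img = net_g(input_img)
    optimizer_g.zero_grad()
    g_loss = gan_loss(net_d(fake_img), True)
    g_loss.backward()
    optimizer_g.step()

    # ---- Discriminator step: unfreeze D. The generator output is
    # detached, so the backward pass stops at fake_img.detach() and no
    # gradient ever reaches net_g; under this pattern the batch is not
    # backpropagated through the Generator a second time, which is why
    # no explicit requires_grad toggle on the Generator is required.
    for p in net_d.parameters():
        p.requires_grad = True

    optimizer_d.zero_grad()
    d_loss = gan_loss(net_d(real_img), True) + \
             gan_loss(net_d(fake_img.detach()), False)
    d_loss.backward()
    optimizer_d.step()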


Reference: TencentARC/GFPGAN#56