

Notes on Andrew Ng's DeepLearning.ai Generative Adversarial Networks (GANs) Specialization

2023-02-22 22:28  By: 聽聽我的腦洞

Generative models

  • Variational Autoencoders: Encoder → Latent Space → Decoder
  • Generative Adversarial Networks:
    • Generator: learns to produce realistic examples
    • Discriminator: learns to distinguish between fake and real

1.6 discriminator P6 - 00:19

Discriminator

Basically, a neural-network classifier.

Compare the prediction Y^\hat with the label Y.

It models P(Y|X), the probability of the class given the features.
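In the simplest case the discriminator is just a binary classifier. A minimal numpy sketch (the one-layer model and the toy weights are illustrative, not the course's architecture):

```python
import numpy as np

def sigmoid(z):
    """Squash a raw score into a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def discriminator(x, w, b):
    """A one-layer 'discriminator': logistic regression over features x.
    Returns P(Y = real | X), to be compared against the label Y."""
    return sigmoid(x @ w + b)

# Toy example: one sample with 3 features.
w = np.array([0.5, -0.2, 0.1])
b = 0.0
x = np.array([1.0, 2.0, 3.0])
y_hat = discriminator(x, w, b)   # prediction in (0, 1)
```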



1.7 generator P7 - 00:01

Generator

Sample noise z, put it into the generator to get X^\hat, then put X^\hat into the discriminator to get Y^\hat_d.

Use the difference between Y^\hat_d and the target label to update the parameters of the generator.

Save the parameters.
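The noise-to-prediction chain can be sketched as follows (toy linear models with random weights, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z, w_g, b_g):
    """Toy linear 'generator': maps a noise vector z to a fake example X_hat."""
    return z @ w_g + b_g

def discriminator(x, w_d, b_d):
    """Toy discriminator: scores an example, returning Y_hat_d = P(real)."""
    return 1.0 / (1.0 + np.exp(-(x @ w_d + b_d)))

z = rng.normal(size=4)               # noise vector
w_g = rng.normal(size=(4, 2))        # generator parameters (toy values)
x_hat = generator(z, w_g, 0.0)       # fake example X_hat
y_hat_d = discriminator(x_hat, rng.normal(size=2), 0.0)  # discriminator output
```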





P(X|Y): the probability of the features given class Y.

Since we only care about one specific class at a time, Y is fixed, so P(X|Y) = P(X) here; we can ignore the Y.

1.8 bce-cost-function P8 - 00:02

BCE cost function

BCE stands for Binary Cross-Entropy; the loss equation can be read part by part.

It is a sum over the entire batch, where the summation from i = 1 to m runs over the m examples:

J(\theta) = -(1/m) \sum_{i=1}^{m} [ y^{(i)} \log h(x^{(i)}, \theta) + (1 - y^{(i)}) \log(1 - h(x^{(i)}, \theta)) ]

h is the prediction,

y is the label,

theta is the parameter,

x are the features.
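The formula translates directly into code; a minimal numpy sketch (the clipping constant is an implementation detail added here to avoid log(0)):

```python
import numpy as np

def bce_loss(y, h):
    """Binary cross-entropy over a batch:
    J = -(1/m) * sum_i [ y_i * log h_i + (1 - y_i) * log(1 - h_i) ]."""
    eps = 1e-12                      # avoid log(0)
    h = np.clip(h, eps, 1 - eps)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

y = np.array([1.0, 0.0, 1.0, 0.0])   # labels: real = 1, fake = 0
h = np.array([0.9, 0.1, 0.8, 0.2])   # predictions
loss = bce_loss(y, h)
```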



1.8 bce-cost-function P8 - 01:54


First term: y^{(i)} \log h(x^{(i)}, \theta).

If the true y is fake, the value of y is 0. Then, no matter what the prediction is, the first term is 0.

If the true y is real, and the prediction gives a high probability of real, say 0.99, the value of the first term is close to 0.

However, if the prediction is close to 0, the first term goes to negative infinity.

Hence, negative infinity here indicates a bad result: if the prediction is good, the term stays near 0; if the prediction is bad, it goes to -\infty (and, after the leading minus sign of the loss, contributes a huge penalty).

Second term: (1 - y^{(i)}) \log(1 - h(x^{(i)}, \theta)).

If the prediction is really bad (close to 1 when the true y is fake), the value goes to negative infinity. Similarly, negative infinity indicates a bad prediction.
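The behaviour of the two terms can be checked numerically (values shown before the loss's leading minus sign):

```python
import math

# First term, real label (y = 1): contribution is log(h).
good = math.log(0.99)     # confident and correct: near 0
bad = math.log(1e-10)     # confident and wrong: large negative value

# Second term, fake label (y = 0): contribution is log(1 - h).
good2 = math.log(1 - 0.01)          # prediction near 0 for a fake: near 0
bad2 = math.log(1 - (1 - 1e-10))    # prediction near 1 for a fake: large negative
```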



1.9 putting-it-all-together P9 - 00:15


For the discriminator, pass X^\hat (fake) and real X into the discriminator, then compute the BCE loss.

Update \theta_d (the parameters of the discriminator).

The discriminator wants to know the difference between fake and real; the generator wants the fake examples to look as real as possible.
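A toy 1-D version of this alternating scheme, assuming a linear generator, a logistic-regression discriminator, and hand-derived gradients. This is a sketch of the training dynamics only, not the course's implementation:

```python
import numpy as np

rng = np.random.default_rng(42)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

# Toy 1-D GAN: real data ~ N(2, 0.5); generator produces x_hat = w*z + b.
w, b = 0.1, 0.0   # generator parameters (theta_g)
a, c = 0.1, 0.0   # discriminator parameters (theta_d)
lr = 0.05

for step in range(2000):
    z = rng.normal(size=32)
    x_real = rng.normal(2.0, 0.5, size=32)
    x_fake = w * z + b

    # Discriminator step: minimize BCE on real (y=1) and fake (y=0),
    # using the gradients of the loss with respect to a and c.
    d_real = sigmoid(a * x_real + c)
    d_fake = sigmoid(a * x_fake + c)
    grad_a = np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    a -= lr * grad_a
    c -= lr * grad_c

    # Generator step: minimize -log D(x_fake) (non-saturating loss);
    # dL/dx_hat = -(1 - D(x_hat)) * a, then chain rule through x_hat = w*z + b.
    d_fake = sigmoid(a * x_fake + c)
    dx = -(1 - d_fake) * a
    w -= lr * np.mean(dx * z)
    b -= lr * np.mean(dx)
```

Note that the two updates touch disjoint parameter sets: the discriminator step only changes (a, c), the generator step only changes (w, b), exactly the \theta_d / \theta_g separation described above.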



2.2 activations-basic-properties P12 - 00:07

Activations

Activation functions must be non-linear and differentiable.


2.3 common-activation-functions P13 - 00:25


ReLU: the dying-ReLU problem. When the input is negative the output is always 0, so information is lost.

Leaky ReLU solves the problem:

max(az, z), with e.g. a = 0.1,

so negative inputs are not clamped to 0 but scaled by a small value.

Sigmoid/Tanh: vanishing-gradient and saturation problems.
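The max(az, z) definition in one line of numpy (a = 0.1 as in the notes):

```python
import numpy as np

def leaky_relu(z, a=0.1):
    """Leaky ReLU: max(a*z, z). For z < 0 the output is a*z instead of 0,
    so gradients still flow and the unit cannot 'die'."""
    return np.maximum(a * z, z)

z = np.array([-2.0, -0.5, 0.0, 3.0])
out = leaky_relu(z)   # negative inputs are scaled by a, positives pass through
```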



2.4 batch-normalization-explained P14 - 04:11

Batch normalization reduces covariate shift, making the network easier to train and speeding up the training process.

2.5 batch-normalization-procedure P15 - 02:43

Normalization during training uses the batch statistics; at test time it uses fixed (running) values.
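The train/test distinction can be sketched in one function, assuming a simple exponential-moving-average update for the running statistics:

```python
import numpy as np

def batch_norm(x, gamma, beta, running_mean, running_var,
               training=True, momentum=0.9, eps=1e-5):
    """Batch normalization: normalize with batch statistics during training,
    and with fixed running statistics at test time."""
    if training:
        mu, var = x.mean(axis=0), x.var(axis=0)
        # Track running statistics for later use at test time.
        running_mean = momentum * running_mean + (1 - momentum) * mu
        running_var = momentum * running_var + (1 - momentum) * var
    else:
        mu, var = running_mean, running_var
    x_hat = (x - mu) / np.sqrt(var + eps)          # normalize
    return gamma * x_hat + beta, running_mean, running_var

x = np.random.default_rng(0).normal(5.0, 3.0, size=(64, 4))
out, rm, rv = batch_norm(x, gamma=1.0, beta=0.0,
                         running_mean=np.zeros(4), running_var=np.ones(4))
```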


3.2 mode-collapse P21 - 00:40




E.g., handwritten digits have 10 modes, one per number; the generator can converge to just 1 mode. That's the mode-collapse problem.



3.3 problem-with-bce-loss P22 - 03:06

Vanishing gradients: as the discriminator gets better, the BCE loss gives the generator smaller and smaller gradients to learn from.
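The saturation behind this is visible in the sigmoid's derivative, which collapses toward zero once the input is far from the origin:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def sigmoid_grad(t):
    """Derivative of the sigmoid: sigma(t) * (1 - sigma(t))."""
    s = sigmoid(t)
    return s * (1 - s)

g_center = sigmoid_grad(0.0)      # healthy gradient at the center (0.25)
g_saturated = sigmoid_grad(10.0)  # vanishingly small once saturated
```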


3.4 earth-movers-distance P23 - 01:11



3.5 wasserstein-loss P24 - 00:03





3.6 condition-on-wasserstein-critic P25 - 00:13




3.7 1-lipschitz-continuity-enforcement P26 - 00:19






4.2 conditional-generation-intuition P28 - 02:05



4.3 conditional-generation-inputs P29 - 02:02



4.4 controllable-generation P30 - 00:19

Controllable generation: control some of the features of the generated output, typically by tweaking the noise vector z after training.
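The z-space manipulations this relies on are simple vector algebra. A sketch with numpy; the "feature direction" here is hypothetical, since in practice it would be discovered from a trained model:

```python
import numpy as np

def interpolate(z1, z2, alpha):
    """Linear interpolation in the latent z-space; feeding the intermediate
    vectors to a trained generator morphs one output into the other."""
    return (1 - alpha) * z1 + alpha * z2

rng = np.random.default_rng(0)
z1, z2 = rng.normal(size=8), rng.normal(size=8)
z_mid = interpolate(z1, z2, 0.5)

# A learned feature direction (hypothetical, e.g. 'add glasses') can be
# added to z to change just that feature of the generated output.
direction = rng.normal(size=8)
z_edited = z1 + 0.5 * direction
```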


4.5 vector-algebra-in-the-z-space P31 - 00:49




4.6 challenges-with-controllable-generation P32 - 01:19



4.7 classifier-gradients P33 - 01:05


Take advantage of a pre-trained classifier: its gradients with respect to the noise vector indicate how to change z to strengthen a desired feature.
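A sketch of the idea with a toy differentiable classifier acting directly on z; in a real pipeline the score would be classifier(generator(z)), and all names here are illustrative:

```python
import numpy as np

def classifier_score(z, w):
    """Toy differentiable 'classifier' on z (a stand-in for
    classifier(generator(z)) in a real pipeline)."""
    return 1.0 / (1.0 + np.exp(-z @ w))

def score_grad(z, w):
    """Gradient of the score with respect to z (sigmoid chain rule)."""
    s = classifier_score(z, w)
    return s * (1 - s) * w

rng = np.random.default_rng(0)
w = rng.normal(size=8)
z = rng.normal(size=8)

before = classifier_score(z, w)
for _ in range(50):
    z = z + 0.1 * score_grad(z, w)   # gradient ascent on z, not on the model
after = classifier_score(z, w)
```

The key design point is that only z is updated; the generator and classifier weights stay frozen.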

4.8 disentanglement P34 - 02:24


4.8 disentanglement P34 - 04:22



