
Variational Autoencoders

  • We add a constraint on the encoding network that forces it to generate latent vectors that roughly follow a unit Gaussian distribution.
  • Generating new images is now easy: all we need to do is sample a latent vector from the unit Gaussian and pass it into the decoder (sketched below).
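As a rough sketch of that generation step (the `decoder` function and the latent size `n_z` are assumptions here, not definitions from these notes):

```python
import numpy as np

n_z = 32  # assumed latent dimensionality
# Sample a latent vector from the unit Gaussian and decode it into an image.
z = np.random.normal(loc=0.0, scale=1.0, size=(1, n_z)).astype(np.float32)
new_image = decoder(z)  # `decoder` is an assumed, already-trained decoder network
```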
During training the network is optimized with two losses at once: a reconstruction (image) loss, and a latent loss that measures how far the encoder's codes are from a unit Gaussian:

```
# pseudocode for the combined training objective
image_loss = mean((generated_image - real_image) ** 2)
latent_loss = kl_divergence(latent_variable, unit_gaussian)
loss = image_loss + latent_loss
```

  • The KL divergence between two Gaussians is easy to compute in closed form (used in the sketch after the snippet below).
  • To optimize this loss with backpropagation, the sampling step also has to be differentiable. This is done with a simple reparameterization trick: instead of the encoder generating a vector of real values directly, it generates a vector of means and a vector of standard deviations, and the latent code is sampled from those:
```python
# TF 1.x-style reparameterization: draw unit-Gaussian noise, then shift and
# scale it by the encoder's outputs so gradients can flow through the sample.
samples = tf.random_normal([batchsize, n_z], mean=0.0, stddev=1.0, dtype=tf.float32)
sampled_z = z_mean + (z_stddev * samples)
```
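Putting the pieces together, a minimal sketch of how the closed-form KL term and the combined loss could look in the same TF 1.x style, continuing from `z_mean`, `z_stddev`, and `sampled_z` above (`decoder`, `real_images`, and `import tensorflow as tf` are assumed, not taken from these notes):

```python
# Decode the reparameterized sample back into an image batch.
generated_images = decoder(sampled_z)  # assumed decoder network, [batchsize, 784]

# Reconstruction term: per-example mean squared pixel error.
image_loss = tf.reduce_mean(tf.square(generated_images - real_images), axis=1)

# Closed-form KL divergence between N(z_mean, z_stddev^2) and the unit Gaussian:
#   KL = 0.5 * sum(mu^2 + sigma^2 - log(sigma^2) - 1)
latent_loss = 0.5 * tf.reduce_sum(
    tf.square(z_mean) + tf.square(z_stddev)
    - tf.log(tf.square(z_stddev) + 1e-8)
    - 1.0,
    axis=1)

loss = tf.reduce_mean(image_loss + latent_loss)
```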

More