From the course: Evaluating and Debugging Generative AI

Identify common model issues

Imagine you're a digital sculptor crafting statues not from clay, but from data and algorithms. What would you do if your tools started acting up? What if your digital chisel kept chipping away at the same piece of the statue over and over? In AI, problems like this show up as mode collapse and vanishing gradients, two issues you can face when training generative models such as GANs and other neural networks. Let's look at mode collapse first. Mode collapse typically occurs when training GANs. If you recall, GANs have a generator and a discriminator. You'll know you've successfully trained a GAN when two things happen: first, the generator consistently produces data that fools the discriminator, and second, it generates diverse data samples. Mode collapse happens when the generator produces only a limited variety of samples. The generator often fools the discriminator by reusing the same realistic data sample over and over again…
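One practical way to spot the symptom described above, low diversity in generated samples, is to measure how far apart the samples in a batch are. The sketch below is a minimal illustration, not a method from the course: it assumes a PyTorch-style generator `G` that maps latent vectors to flat sample tensors, and the function names, `latent_dim`, and the threshold value are all illustrative choices.

```python
# Minimal sketch: a rough check for mode collapse via sample diversity.
# Assumes a PyTorch-style generator `G(z)` that returns a batch of samples;
# all names and thresholds here are illustrative, not from the course.
import torch


def average_pairwise_distance(samples: torch.Tensor) -> float:
    """Mean L2 distance between all pairs of (flattened) samples."""
    flat = samples.flatten(start_dim=1)       # shape (N, D)
    dists = torch.cdist(flat, flat, p=2)      # shape (N, N) pairwise distances
    n = flat.shape[0]
    # Exclude the zero diagonal when averaging.
    return dists.sum().item() / (n * (n - 1))


def mode_collapse_suspected(G, latent_dim: int = 100, n_samples: int = 64,
                            threshold: float = 1e-3) -> bool:
    """Generate a batch and flag suspiciously low diversity."""
    with torch.no_grad():
        z = torch.randn(n_samples, latent_dim)
        fake = G(z)
    diversity = average_pairwise_distance(fake)
    print(f"average pairwise distance: {diversity:.6f}")
    return diversity < threshold  # True -> samples are nearly identical


# Example with a stand-in "collapsed" generator that ignores its input:
collapsed_G = lambda z: torch.ones(z.shape[0], 784)
print("mode collapse suspected:", mode_collapse_suspected(collapsed_G))
```

A near-zero average distance means the generator is emitting almost identical samples, the hallmark of mode collapse; in practice you would track a diversity metric like this over training steps rather than rely on a single fixed threshold.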
