1/ Imagine you have two friends, let's call them the "artist" and the "critic." The artist wants to draw something cool, and the critic wants to judge if the drawing is good or not. The artist tries to draw something, and the critic looks at it and says whether it's good or bad.
2/ Now, the artist really wants to improve, so they keep drawing and the critic keeps judging. Over time, the artist gets better and better at drawing because they learn from the critic's feedback. The artist wants to make drawings that the critic will say are really amazing!
3/ In a GAN, the artist is actually a computer program called a "generator." It tries to create new things, like pictures, starting from random guesses. The critic is called a "discriminator." It looks at pictures made by the generator and tries to figure out if they are real or fake.
4/ The generator and discriminator play a game together. The generator creates a picture, and the discriminator looks at it and decides if it's real or fake. Then, the discriminator gives feedback to the generator, telling it how it can improve.
5/ The generator learns from this feedback and tries to create better pictures. They keep playing this game, and the generator keeps getting better at making pictures that look real!
6/ The generator tries to make things, and the discriminator tries to figure out if they are real or fake. By playing this game, the generator learns and gets better at creating realistic things like pictures or even music.
7/ A GAN is a type of machine learning model that consists of two main components: a generator and a discriminator.
The goal of a GAN is to train the generator and discriminator together in a competitive setting, where each tries to outsmart the other.
8/ As training progresses, the generator learns to generate samples that are increasingly similar to the real data, while the discriminator gets better at distinguishing between real and fake samples. This adversarial setup lets GANs capture complex patterns and distributions.
9/ Implementation -
10/ GAN Training Algorithm and Loss Functions Implementation - we define the generator and discriminator models, the loss functions, and the training loop. The training loop uses TensorFlow's tf.GradientTape() to compute gradients and apply them to update each model's parameters.
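A minimal sketch of such a training step, assuming 28x28 grayscale images and a 100-dim noise vector; the tiny dense models are placeholders for illustration, not the thread's actual architecture:

```python
import tensorflow as tf

NOISE_DIM = 100

# Tiny illustrative models -- real generators/discriminators are deeper.
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(NOISE_DIM,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(28 * 28, activation="tanh"),
    tf.keras.layers.Reshape((28, 28, 1)),
])
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1),  # logit: real vs. fake
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
gen_opt = tf.keras.optimizers.Adam(1e-4)
disc_opt = tf.keras.optimizers.Adam(1e-4)

def train_step(real_images):
    noise = tf.random.normal([tf.shape(real_images)[0], NOISE_DIM])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_out = discriminator(real_images, training=True)
        fake_out = discriminator(fake_images, training=True)
        # Generator wants fakes classified as real (label 1).
        g_loss = bce(tf.ones_like(fake_out), fake_out)
        # Discriminator wants reals -> 1, fakes -> 0.
        d_loss = (bce(tf.ones_like(real_out), real_out)
                  + bce(tf.zeros_like(fake_out), fake_out))
    g_grads = g_tape.gradient(g_loss, generator.trainable_variables)
    d_grads = d_tape.gradient(d_loss, discriminator.trainable_variables)
    gen_opt.apply_gradients(zip(g_grads, generator.trainable_variables))
    disc_opt.apply_gradients(zip(d_grads, discriminator.trainable_variables))
    return g_loss, d_loss

# One step on a dummy batch:
g_loss, d_loss = train_step(tf.random.normal([8, 28, 28, 1]))
```

Two separate tapes are used so each model's gradients can be computed against its own loss in the same forward pass.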
11/ UpSampling2D and Conv2DTranspose Layers for GANs Implementation - we define a generator model that uses Conv2DTranspose layers to upsample the input noise and generate images. The discriminator model uses Conv2D layers to process and analyze the images.
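A sketch of that shape, assuming 32x32 RGB targets; filter counts and kernel sizes here are illustrative choices, not from the thread:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_generator(noise_dim=100):
    return tf.keras.Sequential([
        tf.keras.Input(shape=(noise_dim,)),
        layers.Dense(8 * 8 * 128),
        layers.Reshape((8, 8, 128)),
        # Each strides=2 Conv2DTranspose doubles height and width.
        layers.Conv2DTranspose(64, 4, strides=2, padding="same",
                               activation="relu"),      # -> 16x16
        layers.Conv2DTranspose(3, 4, strides=2, padding="same",
                               activation="tanh"),      # -> 32x32x3
    ])

def build_discriminator():
    return tf.keras.Sequential([
        tf.keras.Input(shape=(32, 32, 3)),
        # Strided Conv2D layers downsample as they extract features.
        layers.Conv2D(64, 4, strides=2, padding="same",
                      activation=tf.nn.leaky_relu),
        layers.Conv2D(128, 4, strides=2, padding="same",
                      activation=tf.nn.leaky_relu),
        layers.Flatten(),
        layers.Dense(1),  # real/fake logit
    ])

g, d = build_generator(), build_discriminator()
imgs = g(tf.random.normal([4, 100]))
logits = d(imgs)
```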
12/ GAN Hacks in Keras to Train Stable Models Implementation - When training GANs, there are several techniques or "hacks" that can help improve stability and convergence.
13/ (For the previous code) Discriminator loss with label smoothing: instead of using a constant value of 1 for real labels and 0 for fake labels, we slightly smooth the labels. This prevents the discriminator from becoming overconfident and helps with stability during training.
14/ (For the previous code) Generator loss with label flipping: instead of using a constant value of 1 for the generator's target labels, we randomly flip some labels between 0 and 1. This technique can help prevent the generator from collapsing to a limited set of outputs.
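A sketch of both hacks; the smoothing value 0.9 and 5% flip rate are common choices assumed here, not values stated in the thread:

```python
import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def discriminator_loss(real_logits, fake_logits):
    # One-sided label smoothing: real targets of 0.9 instead of 1.0
    # keep the discriminator from becoming overconfident.
    real_labels = tf.ones_like(real_logits) * 0.9
    fake_labels = tf.zeros_like(fake_logits)
    return bce(real_labels, real_logits) + bce(fake_labels, fake_logits)

def flipped_labels(logits, flip_rate=0.05):
    # Label flipping: randomly turn a small fraction of 1-targets into
    # 0s, injecting noise that discourages mode collapse.
    labels = tf.ones_like(logits)
    flips = tf.cast(tf.random.uniform(tf.shape(labels)) < flip_rate,
                    labels.dtype)
    return tf.abs(labels - flips)  # flipped entries become 0

d_loss = discriminator_loss(tf.random.normal([4, 1]),
                            tf.random.normal([4, 1]))
labels = flipped_labels(tf.zeros([100, 1]))
```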
15/ GAN to Generate CIFAR-10 Photos Implementation -
16/ The generator takes random noise as input and generates CIFAR-10-like images. The discriminator aims to distinguish between real CIFAR-10 images and the fake/generated images produced by the generator. The GAN model combines the generator and discriminator to train them simultaneously.
17/ During the training loop, the train_step function is called for each batch of real images. Inside the function, the generator produces fake images from random noise. Then both real and fake images are fed into the discriminator, which outputs probabilities for their authenticity.
18/ The generator loss is calculated based on the discriminator's output for the fake images, aiming to fool the discriminator. The discriminator loss is calculated based on its outputs for both real and fake images.
19/ The gradients of the generator and discriminator losses are computed using gradient tape, and then the optimizer applies the gradients to update the trainable variables of the generator and discriminator models.
20/ CycleGAN Models From Scratch - a CycleGAN has two generator models and two discriminator models. One generator converts images from the source domain to the target domain, and the other converts images from the target domain back to the source domain.
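A minimal sketch of that two-generator wiring and the cycle-consistency loss that ties them together; the dense "generators" here are placeholders just to show the structure, not real image models:

```python
import tensorflow as tf

# Two generators: G maps domain X -> Y, F maps domain Y -> X.
G = tf.keras.Sequential([tf.keras.Input(shape=(8,)),
                         tf.keras.layers.Dense(8)])
F = tf.keras.Sequential([tf.keras.Input(shape=(8,)),
                         tf.keras.layers.Dense(8)])

def cycle_consistency_loss(real_x, real_y):
    # Going X -> Y -> X (and Y -> X -> Y) should reconstruct the input;
    # the L1 distance between input and reconstruction is the penalty.
    cycled_x = F(G(real_x))
    cycled_y = G(F(real_y))
    return (tf.reduce_mean(tf.abs(real_x - cycled_x))
            + tf.reduce_mean(tf.abs(real_y - cycled_y)))

loss = cycle_consistency_loss(tf.random.normal([4, 8]),
                              tf.random.normal([4, 8]))
```

In a full CycleGAN this term is added to the adversarial losses from the two discriminators, one per domain.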
21/ Implementation -
22/ Generative Power: GANs are capable of generating realistic and high-quality synthetic data. They can learn the underlying data distribution and produce new samples that resemble the training data, making them useful for tasks such as image synthesis, text generation, etc.
23/ Domain Adaptation and Style Transfer: GANs can be used for domain adaptation, where they learn to transform data from one domain to another. This enables applications such as style transfer, where GANs transfer the style of an image from one domain to another while preserving its content.
24/ Data Augmentation: GANs can be used to augment existing training datasets. By generating synthetic samples, they increase the diversity and quantity of data, which can help improve the generalization and robustness of machine learning models. More - shorturl.at/eCK78
1. Split data using pandas
To split the data, take a random sample of rows and then remove those rows from the original frame by dropping their index values.
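A sketch of that split; the frame and the 30% sample fraction are invented for illustration:

```python
import pandas as pd

df = pd.DataFrame({"a": range(10), "b": list("abcdefghij")})

sample = df.sample(frac=0.3, random_state=42)  # random 30% of rows
rest = df.drop(sample.index)                   # original minus the sample
```

Because `drop` works on the sampled index labels, `sample` and `rest` together cover every row exactly once.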
2. Binning Data
Binning is a technique to group/bin your data into multiple buckets, which is very helpful if you are dealing with continuous numeric data. In pandas you can bin the data using the functions cut and qcut. First, check the shape of your data, i.e. the number of rows and columns.
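A sketch of both functions; the age column and bin edges are invented for illustration:

```python
import pandas as pd

df = pd.DataFrame({"age": [3, 17, 25, 34, 48, 52, 67, 71]})
print(df.shape)  # (rows, columns) -- check the shape before binning

# pd.cut: fixed bin edges you choose yourself.
df["age_group"] = pd.cut(df["age"], bins=[0, 18, 40, 65, 100],
                         labels=["child", "young", "middle", "senior"])

# pd.qcut: quantile-based bins with roughly equal counts per bucket.
df["age_quartile"] = pd.qcut(df["age"], q=4, labels=False)  # 0..3
```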
1/ Indexing data frames
Indexing means selecting all or particular rows and columns of data from a DataFrame. In pandas it can be done using two constructs:
.loc[] : label based
It accepts a single label, a list of labels, a slice object, etc.
.iloc[] : integer position based
2/ Slicing data frames
In order to slice by labels you can use the .loc[] attribute of the DataFrame. Note that label slices include both endpoints, unlike ordinary Python slices.
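A sketch of both constructs on a made-up frame:

```python
import pandas as pd

df = pd.DataFrame({"x": [10, 20, 30], "y": [1.5, 2.5, 3.5]},
                  index=["a", "b", "c"])

df.loc["b"]              # single row by label
df.loc[["a", "c"], "x"]  # list of labels, one column
df.loc["a":"b"]          # label slice -- includes BOTH endpoints
df.iloc[0]               # single row by integer position
df.iloc[0:2, 1]          # position slice -- end-exclusive, like lists
```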
1/ DefaultDict
In Python, a dictionary is a container that holds key-value pairs. Keys must be unique, immutable objects. If you try to access or modify a key that doesn't exist in the dictionary, it raises a KeyError and interrupts your code execution (continued..)
2/ (Continued..) To tackle this issue, Python's defaultdict type, a dictionary-like class, can be used. If you try to access or modify a missing key, defaultdict automatically creates the key and generates a default value for it.
A defaultdict will never raise a KeyError (continued..)
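A sketch of the difference; counting words is a classic defaultdict use:

```python
from collections import defaultdict

counts = defaultdict(int)  # missing keys default to int() == 0
for word in ["red", "blue", "red"]:
    counts[word] += 1      # no KeyError on first access

plain = {}
try:
    plain["missing"] += 1  # a plain dict raises KeyError here
except KeyError:
    pass
```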