Monet Art Generation Using CycleGANs
The goal of this project is to explore the problem of training style transfer models for image generation. We also want to explore image generation itself and produce the most convincing generated images that we can. To do so, we explore and test a Generative Adversarial Network architecture, CycleGAN, as our primary method for training style transfer models.



Model Architecture


The downsampling path consists of three Conv2d layers that increase the channel count from 3 to 64 in the first convolutional layer, from 64 to 128 in the second, and from 128 to 256 in the third. Each convolutional layer is followed by a ReLU or LeakyReLU activation function to give the model nonlinearity. ReLU was preferred for its ability to mitigate vanishing gradients, and LeakyReLU for additionally handling negative inputs. This matters because the images are normalized before being fed to the model: each channel is shifted by 0.5 and divided by 0.5, so pixel values lie in the range [-1, 1] and negative values do occur. We also added a ReflectionPad2d layer of size 1 on each side to increase the receptive field.
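The downsampling path described above can be sketched as follows in PyTorch. The channel sizes (3 → 64 → 128 → 256), padding, and activation choices come from the text; the kernel sizes and strides are assumptions, since the report does not specify them.

```python
import torch
import torch.nn as nn

# Sketch of the downsampling path: 3 -> 64 -> 128 -> 256 channels,
# with ReflectionPad2d(1) before each conv and ReLU/LeakyReLU activations.
# Kernel size 3 and stride 2 are assumptions not stated in the report.
downsample = nn.Sequential(
    nn.ReflectionPad2d(1),
    nn.Conv2d(3, 64, kernel_size=3, stride=2),    # 3 -> 64 channels
    nn.LeakyReLU(0.2),
    nn.ReflectionPad2d(1),
    nn.Conv2d(64, 128, kernel_size=3, stride=2),  # 64 -> 128 channels
    nn.LeakyReLU(0.2),
    nn.ReflectionPad2d(1),
    nn.Conv2d(128, 256, kernel_size=3, stride=2), # 128 -> 256 channels
    nn.ReLU(),
)

# Normalization from the text: subtract 0.5 and divide by 0.5 per channel,
# mapping pixel values from [0, 1] to [-1, 1].
x = torch.rand(1, 3, 256, 256)  # dummy image with values in [0, 1]
x = (x - 0.5) / 0.5             # now in [-1, 1]
features = downsample(x)
print(features.shape)           # torch.Size([1, 256, 32, 32])
```

The [-1, 1] normalization is why LeakyReLU is useful here: roughly half the input values are negative, and a plain ReLU would zero them out in the first layer.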
After that, we added the upsampling path, which consists of two ConvTranspose2d layers that decrease the channel count from 256 to 128 in the first layer and from 128 to 64 in the second. Finally, a convolutional layer maps the 64 channels back to 3, matching the number of channels in the original image. Dropout layers were added during downsampling and upsampling to regularize the model. Activation functions such as ReLU, LeakyReLU, and Tanh provide nonlinearity. This model reduced our loss, and the improvements were more stable, but the loss was still fairly high and the outputs were not where we wanted them to be. We initially used Adagrad as our optimizer for this training session and decided to try RMSProp next.
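The upsampling path can be sketched similarly. The channel sizes (256 → 128 → 64 → 3), the dropout layers, and the final Tanh come from the text; the kernel sizes, strides, paddings, and dropout rate are assumptions.

```python
import torch
import torch.nn as nn

# Sketch of the upsampling path: two ConvTranspose2d layers
# (256 -> 128 -> 64 channels) followed by a conv back to 3 channels.
# Kernel/stride/padding values and the dropout rate are assumptions.
upsample = nn.Sequential(
    nn.ConvTranspose2d(256, 128, kernel_size=3, stride=2,
                       padding=1, output_padding=1),  # 256 -> 128, doubles H and W
    nn.ReLU(),
    nn.Dropout(0.5),
    nn.ConvTranspose2d(128, 64, kernel_size=3, stride=2,
                       padding=1, output_padding=1),  # 128 -> 64, doubles H and W
    nn.ReLU(),
    nn.Dropout(0.5),
    nn.ReflectionPad2d(3),
    nn.Conv2d(64, 3, kernel_size=7),  # back to 3 channels
    nn.Tanh(),                        # outputs in [-1, 1], matching input range
)

features = torch.randn(1, 256, 64, 64)  # dummy feature map from the encoder
out = upsample(features)
print(out.shape)  # torch.Size([1, 3, 256, 256])
```

The Tanh output range of [-1, 1] matches the normalization applied to the inputs, so generated images can be compared directly against real ones before denormalizing for display.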