The Cost of Being Wrong: KL Divergence
If Cross-Entropy is the total cost of sending a message, KL Divergence is the unnecessary tax you pay for having a bad model.
When we train neural networks, we are essentially trying to lower our surprise about the data until the model's predicted distribution aligns with the true distribution of the data.
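To make that concrete, here is a minimal NumPy sketch (the distributions are made-up toy values and the variable names are my own) of the relationship the tagline hints at: the cross-entropy H(P, Q) splits into the true distribution's entropy H(P) plus the KL divergence D_KL(P || Q), the "unnecessary tax" for modeling P with Q.

```python
import numpy as np

# Toy "true" distribution P and a model's estimate Q (made-up values for illustration).
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])

entropy_p = -np.sum(p * np.log(p))          # H(P): the unavoidable cost of encoding data drawn from P
cross_entropy = -np.sum(p * np.log(q))      # H(P, Q): the cost we actually pay when we encode using Q
kl_divergence = np.sum(p * np.log(p / q))   # D_KL(P || Q): the extra cost of using the wrong model

print(entropy_p, cross_entropy, kl_divergence)
# kl_divergence equals cross_entropy - entropy_p (up to floating-point error)
```

Since H(P) is fixed by the data, minimizing cross-entropy and minimizing the KL term amount to the same thing, which is why the two quantities show up together so often.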