A Gentle Introduction to Cross-Entropy for Machine Learning

Last Updated on December 22, 2020

Cross-entropy is commonly used in machine learning as a loss function. It is a measure from the field of information theory, building upon entropy, that calculates the difference between two probability distributions. It is closely related to, but distinct from, KL divergence, which calculates the relative entropy between two probability distributions.
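The relationship between entropy, cross-entropy, and KL divergence described above can be sketched in a few lines of plain Python. The distributions `p` and `q` here are illustrative examples, not from any particular dataset:

```python
from math import log

def entropy(p):
    # H(P) = -sum(p * log(p)), measured in nats (natural log)
    return -sum(pi * log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(P, Q) = -sum(p * log(q)): average surprise of encoding
    # events from P using a code optimized for Q
    return -sum(pi * log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # KL(P || Q) = sum(p * log(p / q)): the relative entropy,
    # i.e. the extra nats paid for using Q instead of P
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two illustrative discrete distributions over three events
p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]

# Cross-entropy decomposes as entropy plus KL divergence:
# H(P, Q) = H(P) + KL(P || Q)
print(cross_entropy(p, q))
print(entropy(p) + kl_divergence(p, q))
```

The decomposition in the final comment is what makes the two quantities "closely related but different": when `P == Q`, the KL divergence is zero and cross-entropy equals the entropy of `P`.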