Hello Juan!

If you are using neural networks, which are the most common setting where entropy shows up as a loss function, you can use one of the many cross-entropy or KL divergence losses. However, it usually doesn't make much sense to *evaluate* a model with entropy: it's a signal to help the machine learn, not a human-facing measure of quality. For a metric to evaluate the model (as opposed to train it), something simple and interpretable like accuracy is best.
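To make the distinction concrete, here is a minimal NumPy sketch (with made-up probabilities and labels) showing cross-entropy as the training loss and accuracy as the evaluation metric:

```python
import numpy as np

# Hypothetical predicted class probabilities for 4 samples over 3 classes.
probs = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
    [0.2, 0.5, 0.3],
])
labels = np.array([0, 1, 2, 0])  # true class indices

# Cross-entropy loss: average negative log-probability of the true class.
# This is what you'd minimize during training.
eps = 1e-12  # avoids log(0)
cross_entropy = -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

# Accuracy: fraction of samples whose highest-probability class is correct.
# This is the simple, interpretable number you'd report for evaluation.
accuracy = np.mean(probs.argmax(axis=1) == labels)

print(f"cross-entropy loss: {cross_entropy:.4f}")  # smooth, differentiable
print(f"accuracy: {accuracy:.2f}")                 # easy to interpret
```

Note how the last sample is misclassified, so accuracy drops to 0.75, while cross-entropy still gives the model partial credit for assigning some probability to the right class.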

Hope this helps!

--

ML enthusiast. Get my book: https://bit.ly/modern-dl-book. Join Medium through my referral link: https://andre-ye.medium.com/membership.
