[TensorFlow 2.0] The Magic (Math) behind the Loss Functions

A Ydobon
1 min read · Oct 3, 2019


Until I finish writing up this topic, refer to here.

In this post, I will explain the 15 loss functions provided by the current TensorFlow 2.0 release. The loss function plays a central role in machine learning algorithms because it serves as the objective function. The stated goal of a machine learning model can be very ambitious, but from a mathematical point of view, every machine learning algorithm is simply a minimization problem over a specified objective function (the loss function).
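To make the "training is minimization" point concrete, here is a toy sketch in plain Python: fitting a one-parameter model y = w·x by gradient descent on the MSE loss. All names and the learning rate below are my own choices for illustration; in TensorFlow 2.0 the same loop would use `tf.GradientTape` and an optimizer.

```python
# Toy data generated by the true relationship y = 2 * x
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def mse(w):
    """Mean squared error of the model y = w * x on the toy data."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

w = 0.0    # initial parameter guess
lr = 0.01  # learning rate (arbitrary choice)

for _ in range(200):
    # Analytic gradient: d/dw MSE = (2/n) * sum((w*x - y) * x)
    grad = 2 * sum((w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

# After training, w has converged to (almost exactly) 2,
# and mse(w) has been driven toward 0.
```

The entire "learning" here is nothing more than pushing the loss downhill, which is the sense in which the loss function *is* the objective of the algorithm.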

Supervised learning algorithms are divided into two types of problems, regression and classification, and so are the loss functions.

For regression problems, the loss functions are:

  1. MSE (mean squared error)
  2. MAE (mean absolute error)
  3. MSLE (mean squared logarithmic error)
  4. MAPE (mean absolute percentage error)
  5. KLD (Kullback–Leibler divergence)
  6. Poisson
  7. Log-cosh
  8. Cosine similarity
  9. Huber
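As a preview of the math behind three of these, here are hand-rolled, plain-Python sketches of MSE, MAE, and the Huber loss, each averaged over the samples. These mirror the formulas behind `tf.keras.losses.MeanSquaredError`, `MeanAbsoluteError`, and `Huber`, but TensorFlow's versions operate on batched tensors, so treat these as illustrations rather than drop-in replacements.

```python
def mse(y_true, y_pred):
    # Mean of squared errors: heavily penalizes large mistakes.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    # Mean of absolute errors: penalizes all mistakes linearly.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def huber(y_true, y_pred, delta=1.0):
    # Quadratic for small errors (|error| <= delta), linear for large ones,
    # which makes it less sensitive to outliers than MSE.
    total = 0.0
    for t, p in zip(y_true, y_pred):
        err = abs(t - p)
        if err <= delta:
            total += 0.5 * err ** 2
        else:
            total += delta * (err - 0.5 * delta)
    return total / len(y_true)

y_true = [0.0, 1.0, 2.0]
y_pred = [0.5, 1.0, 4.0]
# mse  = (0.25 + 0 + 4)   / 3
# mae  = (0.5  + 0 + 2)   / 3
# huber = (0.125 + 0 + 1.5) / 3  (the outlier error of 2 is penalized linearly)
```

Notice how the single large error of 2 dominates the MSE but contributes only linearly to the MAE and Huber losses, which is exactly the trade-off these losses exist to manage.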

For classification problems, the loss functions are:

  1. Binary cross-entropy
  2. Categorical cross-entropy
  3. Sparse categorical cross-entropy
  4. Hinge
  5. Squared hinge
  6. Categorical hinge
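Again as a preview, here are minimal plain-Python sketches of two of these. They follow the textbook formulas behind `tf.keras.losses.BinaryCrossentropy` and `tf.keras.losses.Hinge`; the real TensorFlow implementations additionally clip probabilities for numerical stability and work on batched tensors.

```python
import math

def binary_crossentropy(y_true, y_pred):
    # y_true contains labels in {0, 1}; y_pred contains probabilities in (0, 1).
    # Penalizes confident wrong predictions very heavily (log blows up near 0).
    n = len(y_true)
    return -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                for t, p in zip(y_true, y_pred)) / n

def hinge(y_true, y_pred):
    # y_true contains labels in {-1, 1}; y_pred contains unbounded scores.
    # Zero loss once a prediction is correct with margin at least 1.
    return sum(max(0.0, 1.0 - t * p)
               for t, p in zip(y_true, y_pred)) / len(y_true)

# A confident correct prediction gives a small loss,
# a confident wrong one a large loss:
# binary_crossentropy([1], [0.9]) ≈ 0.105
# binary_crossentropy([1], [0.1]) ≈ 2.303
```

The same asymmetry drives all the cross-entropy variants above: categorical cross-entropy generalizes this formula to one-hot labels over many classes, and the sparse version accepts integer class indices instead of one-hot vectors.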
