sigmoid_cross_entropy_with_logits

This is a TensorFlow 2 function used for soft binary labels. A soft label is one that expresses a degree of likelihood rather than a hard 0 or 1. The function can also be used with hard labels. It measures the probability error in tasks that have two outcomes.

Python3

import tensorflow as tf
  
# type list
input = [1., 2., 3., 4., 5.89]
output = [2, 1, 3, 4, 5.9]
  
# conversion to tensor
# input holds the logits (raw predicted scores)
input = tf.convert_to_tensor(input,
                             dtype=tf.float32)
# output holds the labels (target values)
output = tf.convert_to_tensor(output,
                              dtype=tf.float32)
  
# computing the element-wise sigmoid cross
# entropy loss between labels and logits
x = tf.nn.sigmoid_cross_entropy_with_logits(
    labels=output, logits=input).numpy()
  
print(x)


Output:

[ -0.68673825   0.12692802  -5.9514127  -11.98185    -28.858236  ]
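
Per the TensorFlow documentation, each element of the result is computed with the numerically stable formula max(x, 0) - x * z + log(1 + exp(-|x|)), where x is the logit and z is the label. Note that labels are expected to lie in [0, 1]; values outside that range, as in this example, can make the loss negative. As a quick sketch (using NumPy, not part of the original example), we can verify the first element by hand:

Python3

import numpy as np

# first logit and label from the example above
x, z = 1.0, 2.0

# TensorFlow's documented stable formulation
loss = max(x, 0) - x * z + np.log(1 + np.exp(-abs(x)))
print(loss)  # ~ -0.68673825, matching the first output element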

We can also compute the sigmoid cross entropy loss between two 4D tensors using the sigmoid_cross_entropy_with_logits() function.

Python3

import tensorflow as tf
  
# type list
input = [[[[9], [8]], [[7], [5]]]]
output = [[[[1], [2]], [[3], [4]]]]
  
# conversion to tensor
input = tf.convert_to_tensor(input,
                             dtype=tf.float32)
output = tf.convert_to_tensor(output,
                              dtype=tf.float32)
  
# computing the element-wise sigmoid cross entropy loss
x = tf.nn.sigmoid_cross_entropy_with_logits(
    labels=output, logits=input).numpy()
  
print(x)


Output:

[[[[ 1.2340219e-04]
   [-7.9996648e+00]]
   
  [[-1.3999088e+01]
   [-1.4993284e+01]]]]
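
The result keeps the same shape as the input tensors, with one loss value per element. In training code, this per-element tensor is usually reduced to a single scalar, for example with tf.reduce_mean. A minimal sketch (variable names are illustrative):

Python3

import tensorflow as tf

logits = tf.constant([[[[9.], [8.]], [[7.], [5.]]]])
labels = tf.constant([[[[1.], [2.]], [[3.], [4.]]]])

# per-element losses, same shape as the inputs
per_element = tf.nn.sigmoid_cross_entropy_with_logits(
    labels=labels, logits=logits)

# reduce to a single scalar loss for training
print(tf.reduce_mean(per_element).numpy())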

Sigmoid Cross Entropy function of TensorFlow

TensorFlow is an open-source Machine Learning framework used to develop models. While developing models, we use many functions to check a model's accuracy and loss. For a model to be in good condition, loss calculation is a must, since the loss acts like a penalty: the lower the loss, the better the model performs.

There are many kinds of loss functions. One such function is the sigmoid cross entropy function of TensorFlow. The sigmoid function, or logistic function, generates an S-shaped curve. It is used to predict probabilities, so its output lies between 0 and 1.
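
As a small illustration of this squashing behaviour (a sketch, not from the original article), tf.math.sigmoid maps any real input into the (0, 1) range:

Python3

import tensorflow as tf

x = tf.constant([-5., -1., 0., 1., 5.])

# every value is squashed into (0, 1),
# tracing out the S-shaped curve
print(tf.math.sigmoid(x).numpy())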

Cross entropy loss measures the difference between the actual and the predicted outputs. It is also known as the log loss function and is one of the most widely used loss functions in Machine Learning.
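
For logits x and labels z in [0, 1], this loss is -z * log(sigmoid(x)) - (1 - z) * log(1 - sigmoid(x)); sigmoid_cross_entropy_with_logits computes the same quantity in a numerically stable way. A minimal sketch comparing the two forms (the values are illustrative):

Python3

import tensorflow as tf

logits = tf.constant([0.5, -1.2, 3.0])
labels = tf.constant([1., 0., 1.])

# built-in, numerically stable form
stable = tf.nn.sigmoid_cross_entropy_with_logits(
    labels=labels, logits=logits)

# naive log loss applied to sigmoid probabilities
p = tf.math.sigmoid(logits)
naive = -(labels * tf.math.log(p)
          + (1. - labels) * tf.math.log(1. - p))

print(stable.numpy())
print(naive.numpy())  # matches for these well-behaved values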
