tensorflow::ops::SoftmaxCrossEntropyWithLogits

#include <nn_ops.h>

Computes softmax cross entropy cost and gradients to backpropagate.

Summary

Inputs are the logits, not probabilities.
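
For reference, the standard definitions behind this op (the formulas below are not stated on this page, but follow from the op's semantics): with logits f_i and labels y_i for example i,

  \mathrm{loss}_i = -\sum_j y_{ij} \, \log \mathrm{softmax}(f_i)_j
  \mathrm{backprop}_{ij} = \mathrm{softmax}(f_i)_j - y_{ij}

The second identity relies on the requirement (see Arguments below) that each row of labels sums to 1.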

Arguments:

  • scope: A Scope object.
  • features: batch_size x num_classes matrix of logits.
  • labels: batch_size x num_classes matrix. The caller must ensure that each batch of labels represents a valid probability distribution.

Returns:

  • Output loss: Per-example loss (batch_size vector).
  • Output backprop: Backpropagated gradients, i.e. the gradient of the loss with respect to features (batch_size x num_classes matrix).
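
A minimal end-to-end sketch (illustrative, not part of this page; it assumes a build that links the TensorFlow C++ client library, and the names root, xent, and session are our own):

#include <vector>
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;

  Scope root = Scope::NewRootScope();

  // batch_size = 2, num_classes = 3: unnormalized logits and one-hot labels.
  auto features = Const(root, {{1.0f, 2.0f, 3.0f}, {1.0f, 1.0f, 1.0f}});
  auto labels   = Const(root, {{0.0f, 0.0f, 1.0f}, {0.0f, 1.0f, 0.0f}});

  auto xent = SoftmaxCrossEntropyWithLogits(root, features, labels);

  ClientSession session(root);
  std::vector<Tensor> outputs;
  // Fetch both outputs: the per-example loss and the gradient w.r.t. features.
  TF_CHECK_OK(session.Run({xent.loss, xent.backprop}, &outputs));

  // outputs[0] has shape [2] (loss); outputs[1] has shape [2, 3] (backprop).
  return 0;
}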
Constructors and Destructors

  • SoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope & scope, ::tensorflow::Input features, ::tensorflow::Input labels)

Public attributes

  • ::tensorflow::Output backprop
  • ::tensorflow::Output loss
  • Operation operation

Public attributes

backprop

::tensorflow::Output backprop

loss

::tensorflow::Output loss

operation

Operation operation
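
The operation attribute exposes the underlying graph node. As an illustrative sketch (reusing root and xent from the example above, which are our names, not part of this page), it can feed the control-dependency mechanism of the Scope API:

// Ensure the cross-entropy node runs before this Identity op executes.
Scope gated = root.WithControlDependencies({xent.operation});
auto loss_after = Identity(gated, xent.loss);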

Public functions

SoftmaxCrossEntropyWithLogits

SoftmaxCrossEntropyWithLogits(
  const ::tensorflow::Scope & scope,
  ::tensorflow::Input features,
  ::tensorflow::Input labels
)
