-This classification example uses the [cross-entropy](https://github.com/bpesquet/mlcourse/tree/main/lectures/classification_performance#assessing-performance-during-training-1) a.k.a. negative log-likelihood loss function, implemented by the [CrossEntropyLoss](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html) class.
+This multiclass classification example uses the [cross-entropy](https://github.com/bpesquet/mlcourse/tree/main/lectures/classification_performance#assessing-performance-during-training-1) a.k.a. negative log-likelihood loss function, implemented by the [CrossEntropyLoss](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html) class.
> [!NOTE]
> PyTorch also offers the [NLLLoss](https://pytorch.org/docs/stable/generated/torch.nn.NLLLoss.html#torch.nn.NLLLoss) class, which implements the negative log-likelihood loss. A key difference is that `CrossEntropyLoss` expects *logits* (raw, unnormalized predictions) as inputs, and applies [LogSoftmax](https://pytorch.org/docs/stable/generated/torch.nn.LogSoftmax.html#torch.nn.LogSoftmax) to turn them into log-probabilities before computing its output. Using `CrossEntropyLoss` is equivalent to applying `LogSoftmax` followed by `NLLLoss` ([more details](https://towardsdatascience.com/cross-entropy-negative-log-likelihood-and-all-that-jazz-47a95bd2e81)).
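This equivalence can be checked directly. In the sketch below, the tensor shapes (4 samples, 3 classes) and values are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical batch: 4 samples, 3 classes
logits = torch.randn(4, 3)
targets = torch.tensor([0, 2, 1, 2])

# CrossEntropyLoss applied directly to the logits
ce = nn.CrossEntropyLoss()(logits, targets)

# LogSoftmax followed by NLLLoss gives the same result
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)

assert torch.allclose(ce, nll)
```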
```python
-# Use cross-entropy loss function.
+# Use cross-entropy loss function for this multiclass classification task.
# Softmax is computed internally to convert outputs into probabilities
criterion = nn.CrossEntropyLoss()
```
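As a quick sanity check (not part of the original snippet), the criterion can be called directly on a batch of logits and integer class labels; the shapes and values below are hypothetical:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# Hypothetical batch: 5 samples, 3 classes.
# Targets are integer class indices, not one-hot vectors.
logits = torch.randn(5, 3)
targets = torch.tensor([0, 1, 2, 1, 0])

loss = criterion(logits, targets)
print(loss.item())  # a positive scalar; lower is better
```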
@@ -176,7 +176,7 @@ for epoch in range(n_epochs):
n_correct = 0
# For each batch of data
-for x_batch, y_batch in blobs_dataloader:
+for x_batch, y_batch in train_dataloader:
# Forward pass
y_pred = model(x_batch)
@@ -310,7 +310,6 @@ def plot_decision_boundaries(model, x, y, title, device):
For this binary classification task, we use the [binary cross-entropy](https://github.com/bpesquet/mlcourse/tree/main/lectures/classification_performance#choosing-a-loss-function) loss function, implemented by the PyTorch [BCELoss](https://pytorch.org/docs/stable/generated/torch.nn.BCELoss.html) class.
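Note that `BCELoss` expects probabilities in [0, 1] as inputs, so the model's raw outputs must first pass through a sigmoid. A minimal sketch (the tensor shapes are illustrative, not from the original example):

```python
import torch
import torch.nn as nn

# Hypothetical mini-batch: 4 raw model outputs and binary targets
logits = torch.randn(4, 1)
targets = torch.randint(0, 2, (4, 1)).float()

# BCELoss expects probabilities, so apply a sigmoid first
probas = torch.sigmoid(logits)
bce = nn.BCELoss()(probas, targets)

# BCEWithLogitsLoss fuses the sigmoid into the loss and is more
# numerically stable; it produces the same value
bce_logits = nn.BCEWithLogitsLoss()(logits, targets)

assert torch.allclose(bce, bce_logits, atol=1e-6)
```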
```python
-# Use binary cross-entropy loss function
+# Use binary cross-entropy loss function for this binary classification task