Commit d2a416c

mdreves authored and tf-model-analysis-team committed

Add FAQ on the need for binarization settings when a binary classification model outputs two classes.

PiperOrigin-RevId: 390708473

1 parent e879b7a commit d2a416c

File tree

1 file changed: +24 -0 lines changed

g3doc/faq.md

@@ -420,6 +420,30 @@ implies FN must be `(N - TP)` or `N = TP + FN`. The end result is

`precision@1 = TP / N = recall@1`. Note that this only applies when there is a
single label per example, not for multi-label.

### Why are my mean_label and mean_prediction metrics always 0.5?

This is most likely because the metrics are configured for a binary
classification problem, but the model is outputting probabilities for both of
the classes instead of just one. This is common when
[TensorFlow's classification API](https://www.tensorflow.org/tfx/serving/signature_defs#classification_signaturedef)
is used. The solution is to choose the class that you would like the
predictions to be based on and then binarize on that class. For example:

```python
eval_config = text_format.Parse("""
  ...
  metrics_specs {
    binarize { class_ids: { values: [0] } }
    metrics { class_name: "MeanLabel" }
    metrics { class_name: "MeanPrediction" }
    ...
  }
  ...
""", config.EvalConfig())
```

Note: This applies to all metrics, not just `mean_label` and `mean_prediction`.
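To see why both metrics land on exactly 0.5 without binarization: a two-class probability output sums to 1 per example (and a one-hot label contributes one 1 and one 0), so averaging over both columns always yields 0.5 no matter what the model predicts. A minimal sketch in plain Python — the probability values here are made up for illustration:

```python
# Hypothetical two-class probability outputs, one row per example.
# Each row sums to 1, as with a softmax / classification signature.
preds = [[0.9, 0.1], [0.3, 0.7], [0.5, 0.5]]

# Averaging over BOTH columns always gives 0.5, because each row sums to 1.
flat = [p for row in preds for p in row]
mean_both = sum(flat) / len(flat)
print(mean_both)  # 0.5

# Binarizing on class 0 (keeping only that column) restores a meaningful value.
mean_class0 = sum(row[0] for row in preds) / len(preds)
print(mean_class0)
```

The same arithmetic explains `mean_label`: summed over both one-hot columns, every example contributes exactly 1, so the average per output is again 0.5.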
### How to interpret the MultiLabelConfusionMatrixPlot?

Given a particular label, the `MultiLabelConfusionMatrixPlot` (and associated
