Commit 7285787

Fixed typo in metrics_prob_input (#455)
1 parent 01d4003 commit 7285787

1 file changed (+2 −2 lines)

docs/models.md: 2 additions & 2 deletions
@@ -69,11 +69,11 @@ PyTorch Tabular also accepts custom loss functions (which are drop in replacemen
 
 While the Loss functions drive the gradient-based optimization, we keep track of the metrics that we care about during training. By default, PyTorch Tabular tracks Accuracy for classification and Mean Squared Error for regression. You can choose any functional metrics (as a list of strings) from [TorchMetrics](https://torchmetrics.readthedocs.io/en/stable/references/metric.html).
 
-Some metrics need parameters to work the way we expect them to, e.g. the averaging scheme for a multi-class F1 score. Such parameters can be fed in through `metrics_params`, a list of dictionaries holding the parameters for the metrics declared, in the same order. For classification metrics, PyTorch Tabular also needs to know whether each metric expects the predicted class (like Accuracy) or the predicted probability (like ROCAUC). This is specified by `metrics_prob_input`, a list of `True` or `False` values, one for each custom metric you define. The lengths of `metrics_params`, `metric_prob_input` and `metrics` should be the same.
+Some metrics need parameters to work the way we expect them to, e.g. the averaging scheme for a multi-class F1 score. Such parameters can be fed in through `metrics_params`, a list of dictionaries holding the parameters for the metrics declared, in the same order. For classification metrics, PyTorch Tabular also needs to know whether each metric expects the predicted class (like Accuracy) or the predicted probability (like ROCAUC). This is specified by `metrics_prob_input`, a list of `True` or `False` values, one for each custom metric you define. The lengths of `metrics_params`, `metrics_prob_input` and `metrics` should be the same.
 
 ```python
 metrics = ["accuracy", "f1_score"]
-metric_prob_input = [False, False]
+metrics_prob_input = [False, False]
 metrics_params = [{}, {"num_classes": 2}]
 ```
 
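For context, a minimal sketch of how these three parallel lists are passed to a model config, assuming pytorch_tabular's `CategoryEmbeddingModelConfig`; the metric choices and values here are illustrative, not part of this commit:

```python
# Minimal sketch (not part of this commit): wiring the metric lists into a
# PyTorch Tabular model config. CategoryEmbeddingModelConfig is one of the
# library's model configs; any model config accepts the same metric fields.
from pytorch_tabular.models import CategoryEmbeddingModelConfig

model_config = CategoryEmbeddingModelConfig(
    task="classification",
    metrics=["accuracy", "f1_score"],
    # One flag per metric: True if the metric expects predicted
    # probabilities (e.g. ROCAUC), False if it expects class labels.
    metrics_prob_input=[False, False],
    # One kwargs dict per metric, in the same order as `metrics`.
    metrics_params=[{}, {"num_classes": 2}],
)
```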
