sklearn.metrics.classification_report
sklearn.metrics.classification_report(y_true, y_pred, labels=None, target_names=None)
Build a text report showing the main classification metrics.
Parameters :

y_true : array-like or list of labels or label indicator matrix
    Ground truth (correct) target values.

y_pred : array-like or list of labels or label indicator matrix
    Estimated targets as returned by a classifier.

labels : array, shape = [n_labels]
    Optional list of label indices to include in the report.

target_names : list of strings
    Optional display names matching the labels (same order).

Returns :

report : string
    Text summary of the precision, recall and F1 score for each class.
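As a quick illustration of the labels and target_names arguments (a minimal sketch, not part of the library's own example; the data and variable names are illustrative), the report can be restricted to a subset of classes and the rows reordered:

>>> from sklearn.metrics import classification_report
>>> y_true = [0, 1, 2, 2, 2]
>>> y_pred = [0, 0, 2, 2, 1]
>>> # keep only classes 2 and 1, in that order; target_names must match labels
>>> report = classification_report(y_true, y_pred, labels=[2, 1], target_names=['class 2', 'class 1'])

The resulting string then contains only the rows for classes 2 and 1, in that order, and the summary row reflects only the selected labels.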
Examples
>>> from sklearn.metrics import classification_report
>>> y_true = [0, 1, 2, 2, 2]
>>> y_pred = [0, 0, 2, 2, 1]
>>> target_names = ['class 0', 'class 1', 'class 2']
>>> print(classification_report(y_true, y_pred, target_names=target_names))
             precision    recall  f1-score   support

    class 0       0.50      1.00      0.67         1
    class 1       0.00      0.00      0.00         1
    class 2       1.00      0.67      0.80         3

avg / total       0.70      0.60      0.61         5
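If the per-class numbers are needed as arrays rather than as formatted text, the same values that appear in the report rows can be obtained from sklearn.metrics.precision_recall_fscore_support (a minimal sketch under that assumption; the variable names are illustrative):

>>> from sklearn.metrics import precision_recall_fscore_support
>>> y_true = [0, 1, 2, 2, 2]
>>> y_pred = [0, 0, 2, 2, 1]
>>> # average=None returns one value per class instead of a single aggregate
>>> p, r, f1, support = precision_recall_fscore_support(y_true, y_pred, average=None)

Here p, r, f1 and support are arrays with one entry per class, in the same order as the rows of the report above.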