Machine learning with scikit-learn datasets: evaluating with metrics

Accuracy

Precision

Recall

F-measure

Confusion matrix
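The snippets below assume `expected` (true labels) and `predicted` (model output) already exist. A minimal sketch of how they might be produced, assuming the digits dataset and an SVM classifier (both are assumptions, not stated in the original):

```python
from sklearn import datasets, metrics, svm
from sklearn.model_selection import train_test_split

# Load the digits dataset: 1797 8x8 images, labels 0-9 (assumed dataset)
digits = datasets.load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# Train a support-vector classifier (gamma chosen to suit this dataset)
clf = svm.SVC(gamma=0.001)
clf.fit(X_train, y_train)

expected = y_test                 # ground-truth labels
predicted = clf.predict(X_test)   # model predictions
print('Accuracy:', metrics.accuracy_score(expected, predicted))
```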

#metrics.accuracy_score(test_target, predicted)
print('Accuracy:\n', metrics.accuracy_score(expected, predicted))
#metrics.precision_score(test_target, predicted, average=...)
# pos_label selects the positive class for binary targets; for a multiclass
# target, pass average= (e.g. 'macro') instead
print('Precision:\n', metrics.precision_score(expected, predicted, average='macro'))
[http://scikit-learn.org/stable/modules/generated/sklearn.metrics.f1_score.html:title]

print('F1:\n', metrics.f1_score(expected, predicted, average='macro'))
[http://scikit-learn.org/stable/modules/generated/sklearn.metrics.recall_score.html#sklearn.metrics.recall_score:title]

print('Recall:\n', metrics.recall_score(expected, predicted, average='macro'))
[http://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html:title]

print('Confusion_matrix:\n', metrics.confusion_matrix(expected, predicted))
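To see what each score measures, here is a tiny binary example (made-up labels) where the definitions can be checked by hand:

```python
from sklearn import metrics

y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0]  # 3 TP, 1 FN, 1 FP, 3 TN

# Precision = TP / (TP + FP) = 3/4; Recall = TP / (TP + FN) = 3/4
p = metrics.precision_score(y_true, y_pred)
r = metrics.recall_score(y_true, y_pred)
# F1 is the harmonic mean of precision and recall
f1 = metrics.f1_score(y_true, y_pred)
print(p, r, f1)  # 0.75 0.75 0.75

# Rows = true class, columns = predicted class: [[TN, FP], [FN, TP]]
print(metrics.confusion_matrix(y_true, y_pred))
```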

Reference:
Notes on evaluating information retrieval (precision, recall, F-measure), siguniang.wordpress.com

from sklearn.metrics import classification_report
print(classification_report(expected, predicted))

             precision    recall  f1-score   support

          0       1.00      0.98      0.99        47
          1       0.91      0.85      0.88        47
          2       1.00      0.98      0.99        45
          3       0.94      1.00      0.97        46
          4       0.98      0.98      0.98        49
          5       0.90      0.98      0.94        54
          6       0.93      0.95      0.94        39
          7       1.00      0.98      0.99        41
          8       0.85      0.90      0.88        39
          9       0.97      0.86      0.91        43

avg / total       0.95      0.95      0.95       450
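In the report, `support` is the number of true samples in each class, and the bottom row averages the per-class scores weighted by support. A quick check of that weighting on a small made-up example (hypothetical labels, not the table above):

```python
from sklearn import metrics

y_true = [0, 0, 0, 1, 1, 2]
y_pred = [0, 0, 1, 1, 1, 2]

# Per-class F1 scores (one per label 0, 1, 2)
f1_per_class = metrics.f1_score(y_true, y_pred, average=None)
# Support-weighted mean, as shown in the report's bottom row:
# supports are 3, 2, 1 -> (3*0.8 + 2*0.8 + 1*1.0) / 6 = 5/6
weighted = metrics.f1_score(y_true, y_pred, average='weighted')
print(metrics.classification_report(y_true, y_pred))
```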