Logging training

Running DummyClassifier()
accuracy: 0.643    average_precision: 0.357    roc_auc: 0.500    recall_macro: 0.500    f1_macro: 0.392
=== new best DummyClassifier() (using recall_macro):
accuracy: 0.643    average_precision: 0.357    roc_auc: 0.500    recall_macro: 0.500    f1_macro: 0.392

Running GaussianNB()
accuracy: 0.623    average_precision: 0.505    roc_auc: 0.590    recall_macro: 0.560    f1_macro: 0.549
=== new best GaussianNB() (using recall_macro):
accuracy: 0.623    average_precision: 0.505    roc_auc: 0.590    recall_macro: 0.560    f1_macro: 0.549

Running MultinomialNB()
accuracy: 0.647    average_precision: 0.481    roc_auc: 0.609    recall_macro: 0.589    f1_macro: 0.588
=== new best MultinomialNB() (using recall_macro):
accuracy: 0.647    average_precision: 0.481    roc_auc: 0.609    recall_macro: 0.589    f1_macro: 0.588

Running DecisionTreeClassifier(class_weight='balanced', max_depth=1)
accuracy: 0.586    average_precision: 0.401    roc_auc: 0.568    recall_macro: 0.568    f1_macro: 0.558

Running DecisionTreeClassifier(class_weight='balanced', max_depth=5)
accuracy: 0.590    average_precision: 0.419    roc_auc: 0.564    recall_macro: 0.576    f1_macro: 0.560

Running DecisionTreeClassifier(class_weight='balanced', min_impurity_decrease=0.01)
accuracy: 0.582    average_precision: 0.393    roc_auc: 0.563    recall_macro: 0.567    f1_macro: 0.555

Running LogisticRegression(C=0.1, class_weight='balanced', max_iter=1000)
accuracy: 0.574    average_precision: 0.487    roc_auc: 0.425    recall_macro: 0.548    f1_macro: 0.547

Running LogisticRegression(class_weight='balanced', max_iter=1000)
accuracy: 0.578    average_precision: 0.470    roc_auc: 0.437    recall_macro: 0.562    f1_macro: 0.557

Best model:
Pipeline(steps=[('minmaxscaler', MinMaxScaler()), ('multinomialnb', MultinomialNB())])
Best Scores:
accuracy: 0.647    average_precision: 0.481    roc_auc: 0.609    recall_macro: 0.589    f1_macro: 0.588
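The log above records a search over a small portfolio of scikit-learn classifiers: each candidate is run, scored on several metrics, and ranked by recall_macro, with a "=== new best" line whenever a candidate improves on the current leader; the winning MinMaxScaler + MultinomialNB pipeline and its scores are printed at the end. The sketch below shows one way such a log could be produced; it is a minimal illustration under stated assumptions, not the original tool's implementation. The log_portfolio helper, the fixed candidate list, and wrapping every estimator in a MinMaxScaler pipeline are assumptions made here for clarity.

```python
# Minimal sketch: run a fixed portfolio of classifiers, log cross-validated
# scores for several metrics, and keep the best model by recall_macro.
# Helper names and the pipeline layout are illustrative assumptions.
from sklearn.dummy import DummyClassifier
from sklearn.naive_bayes import GaussianNB, MultinomialNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import cross_validate

SCORING = ["accuracy", "average_precision", "roc_auc", "recall_macro", "f1_macro"]

def log_portfolio(X, y, rank_by="recall_macro"):
    candidates = [
        DummyClassifier(),
        GaussianNB(),
        MultinomialNB(),
        DecisionTreeClassifier(class_weight="balanced", max_depth=1),
        DecisionTreeClassifier(class_weight="balanced", max_depth=5),
        DecisionTreeClassifier(class_weight="balanced", min_impurity_decrease=0.01),
        LogisticRegression(C=0.1, class_weight="balanced", max_iter=1000),
        LogisticRegression(class_weight="balanced", max_iter=1000),
    ]
    best_pipe, best_scores = None, None
    for est in candidates:
        print(f"Running {est}")
        # MinMaxScaler keeps features non-negative, which MultinomialNB requires.
        pipe = make_pipeline(MinMaxScaler(), est)
        cv = cross_validate(pipe, X, y, scoring=SCORING)
        scores = {m: cv[f"test_{m}"].mean() for m in SCORING}
        print("  ".join(f"{m}: {v:.3f}" for m, v in scores.items()))
        # Track the best candidate seen so far according to the ranking metric.
        if best_scores is None or scores[rank_by] > best_scores[rank_by]:
            best_pipe, best_scores = pipe, scores
            print(f"=== new best {est} (using {rank_by}):")
            print("  ".join(f"{m}: {v:.3f}" for m, v in scores.items()))
    print(f"Best model:\n{best_pipe}")
    print("Best Scores:")
    print("  ".join(f"{m}: {v:.3f}" for m, v in best_scores.items()))
    return best_pipe, best_scores
```

Called as `best_pipe, best_scores = log_portfolio(X_train, y_train)`, this produces a trace in the same shape as the log above, with the selected pipeline returned for further tuning or a final fit on the training set.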