Decision Tree:
        Decision trees are a non-parametric supervised learning method used for classification and regression. The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features.
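As a minimal sketch of the idea, the snippet below fits scikit-learn's DecisionTreeClassifier on a synthetic dataset; the dataset and the max_depth setting are illustrative, not taken from the experiments reported here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy binary-classification dataset (illustrative only)
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# max_depth limits how many decision rules can be chained,
# which helps control overfitting
tree = DecisionTreeClassifier(max_depth=5, random_state=42)
tree.fit(X_train, y_train)
print(round(tree.score(X_test, y_test), 2))
```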
Random Forest:
              Random forest is an ensemble learning method for classification, regression, and other tasks that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes (classification) or the mean prediction (regression) of the individual trees.
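A short sketch of this, again on an illustrative synthetic dataset, using scikit-learn's RandomForestClassifier:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 100 trees is trained on a bootstrap sample of the data;
# class predictions are aggregated by majority vote across the trees.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
print(round(forest.score(X_test, y_test), 2))
```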
XGBoost:
             XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the gradient boosting framework and provides parallel tree boosting that solves many data science problems quickly and accurately.
Majority voting:
             Majority voting is an ensemble approach that combines different models by taking the mode of the predicted classes (for classification) or the mean prediction (for regression) of the target.
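Sketched below with scikit-learn's VotingClassifier; the three base models mirror the ones compared in this section, though the dataset and settings are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# voting="hard" takes the mode of the individual models' class predictions
vote = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    voting="hard",
)
vote.fit(X_train, y_train)
print(round(vote.score(X_test, y_test), 2))
```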
Stacking:
          Stacking (also called meta-ensembling) is a model ensembling technique that combines the predictions of multiple base models, feeding them as inputs to a new meta-model that produces the final prediction.
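A minimal sketch with scikit-learn's StackingClassifier, using a logistic regression as the meta-model (an illustrative choice, as is the dataset):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Cross-validated predictions of the base models become the input
# features of the final (meta) estimator.
stack = StackingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_train, y_train)
print(round(stack.score(X_test, y_test), 2))
```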
Final test scores (all metrics in %):

        Model               | Accuracy | Precision | F1-score | Recall
        Logistic Regression |   47.5   |    51     |    48    |   48
        Decision Tree       |   52     |    50     |    50    |   52
        Random Forest       |   58     |    56     |    52    |   58
        XGBoost             |   58     |    54     |    54    |   58
        Majority Voting     |   52     |    53     |    50    |   52
        Stacking            |   58     |    70     |    62    |   58