from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Three heterogeneous base classifiers
clf1 = DecisionTreeClassifier(max_depth=4)
clf2 = LogisticRegression(solver='liblinear')
clf3 = GaussianNB()

# With the default voting='hard', the ensemble predicts the class
# label chosen by the majority of the individual classifiers
voting_clf = VotingClassifier(estimators=[('dt', clf1), ('lr', clf2), ('nb', clf3)])
voting_clf.fit(X_train, y_train)

In this first example, we combine three different classifiers, a decision tree, a logistic regression model, and a Gaussian Naive Bayes model. Since the voting parameter is left at its default of 'hard', the ensemble predicts whichever class label receives the most votes from the individual classifiers.
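Because X_train and y_train are not defined in the snippet above, the following is a minimal end-to-end sketch of the same hard-voting ensemble, assuming a synthetic dataset from make_classification stands in for whatever data you have; the comparison loop at the end is just an illustrative way to contrast the ensemble with its base models.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for the tutorial's data (an assumption; any
# labelled feature matrix and target vector work here)
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

voting_clf = VotingClassifier(estimators=[
    ('dt', DecisionTreeClassifier(max_depth=4)),
    ('lr', LogisticRegression(solver='liblinear')),
    ('nb', GaussianNB()),
])  # voting='hard' by default: majority vote over predicted labels
voting_clf.fit(X_train, y_train)

# Compare the ensemble with each base model on held-out data
for name, model in [('ensemble', voting_clf),
                    ('dt', DecisionTreeClassifier(max_depth=4)),
                    ('lr', LogisticRegression(solver='liblinear')),
                    ('nb', GaussianNB())]:
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))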
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

# probability=True enables predict_proba on the SVM,
# which soft voting requires
clf1 = SVC(kernel='linear', probability=True)
clf2 = KNeighborsClassifier(n_neighbors=5)

voting_clf = VotingClassifier(estimators=[('svm', clf1), ('knn', clf2)],
                              voting='soft', weights=[2, 1])
voting_clf.fit(X_train, y_train)

In this example, we use two different classification models, a Support Vector Machine classifier and a k-Nearest Neighbors classifier. We pass these classifiers to the VotingClassifier class and set the voting parameter to 'soft', which means the ensemble averages the predicted class probabilities from the individual classifiers and predicts the class with the highest weighted average probability. We also set the weights parameter to [2, 1], so the SVM classifier's probabilities carry twice the weight of the KNN classifier's in that average. Note that soft voting requires every base estimator to implement predict_proba, which is why the SVC is constructed with probability=True. These examples demonstrate how to use the sklearn.ensemble VotingClassifier class to combine different classification models and make predictions by voting. The library used in these examples is scikit-learn.
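To make the weighting concrete, the sketch below checks the soft-voting arithmetic by hand: it recomputes the weighted average of each base estimator's class probabilities and confirms it matches the ensemble's predict_proba output. It assumes the soft-voting ensemble above was fitted on the synthetic X_train/X_test split from the earlier sketch, since those arrays are not defined in the original snippets.

import numpy as np

# Fitted base estimators are exposed via named_estimators_
p_svm = voting_clf.named_estimators_['svm'].predict_proba(X_test)
p_knn = voting_clf.named_estimators_['knn'].predict_proba(X_test)

# Soft voting: weighted average of per-class probabilities,
# then argmax over classes
manual = np.average([p_svm, p_knn], axis=0, weights=[2, 1])
assert np.allclose(manual, voting_clf.predict_proba(X_test))

# First few ensemble predictions, recovered from the manual average
print(voting_clf.classes_[manual.argmax(axis=1)][:5])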