from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

lr = LogisticRegression()
dt = DecisionTreeClassifier()

# instantiate the ensemble classifier
ensemble = VotingClassifier(estimators=[('lr', lr), ('dt', dt)], voting='soft')

# switch the voting strategy using set_params
ensemble.set_params(voting='hard')

# fit and predict
ensemble.fit(X_train, y_train)
ensemble.predict(X_test)
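The snippet above assumes `X_train`, `y_train`, and `X_test` already exist. A minimal end-to-end sketch, using a synthetic dataset from `make_classification` purely as stand-in data, might look like this:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in for the X_train/y_train/X_test assumed above
X, y = make_classification(n_samples=200, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

ensemble = VotingClassifier(
    estimators=[('lr', LogisticRegression(max_iter=1000)),
                ('dt', DecisionTreeClassifier(random_state=42))],
    voting='soft',
)
ensemble.set_params(voting='hard')  # switch strategy before fitting
ensemble.fit(X_train, y_train)
preds = ensemble.predict(X_test)
print(ensemble.score(X_test, y_test))
```

With hard voting, each fitted estimator casts one vote per sample and the majority class wins; `set_params` must be called before `fit` for the new setting to take effect on the fitted model.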
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import GridSearchCV

lr = LogisticRegression()
dt = DecisionTreeClassifier()

# hyperparameter grid: prefix each parameter with its estimator's name
param_grid = {'lr__C': [0.1, 1.0, 10.0], 'dt__max_depth': [3, 4, 5]}

# instantiate the ensemble classifier
ensemble = VotingClassifier(estimators=[('lr', lr), ('dt', dt)], voting='soft')

# perform grid search with cross-validation
grid = GridSearchCV(ensemble, param_grid=param_grid, cv=5)
grid.fit(X_train, y_train)

# print best hyperparameters and score
print("Best parameters:", grid.best_params_)
print("Best score:", grid.best_score_)

In this example, we use the VotingClassifier within a grid search to tune the ensemble's hyperparameters. We create instances of LogisticRegression and DecisionTreeClassifier and pass them as estimators to a VotingClassifier named `ensemble`, setting the voting parameter to 'soft'. We then define a parameter grid whose keys prefix each hyperparameter with the name of the estimator it belongs to (for example, `lr__C` tunes the regularization strength of the logistic regression). GridSearchCV takes the ensemble classifier and the parameter grid as arguments and evaluates every combination with five-fold cross-validation. Finally, we print the best hyperparameters and score found by the search. Both examples use the sklearn.ensemble package.
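After the search finishes, `GridSearchCV` refits the best parameter combination on the full training set, so `best_estimator_` is a ready-to-use VotingClassifier. A runnable sketch of that follow-up step, again assuming synthetic data in place of the unspecified `X_train`/`y_train`:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in data (assumption; the tutorial does not define a dataset)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[('lr', LogisticRegression(max_iter=1000)),
                ('dt', DecisionTreeClassifier(random_state=0))],
    voting='soft',
)
param_grid = {'lr__C': [0.1, 1.0, 10.0], 'dt__max_depth': [3, 4, 5]}
grid = GridSearchCV(ensemble, param_grid=param_grid, cv=5)
grid.fit(X_train, y_train)

# the refitted best estimator can predict on new data directly
print("Best parameters:", grid.best_params_)
print("Held-out accuracy:", grid.best_estimator_.score(X_test, y_test))
```

Note that `grid.predict(X_test)` delegates to `best_estimator_` as well, so either form works for downstream prediction.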