XGBoost is an efficient and scalable implementation of gradient boosted decision trees. The XGBClassifier is the scikit-learn-compatible classifier interface for XGBoost and supports binary and multi-class classification; regression tasks are handled by the companion XGBRegressor class.
The get_params method of the XGBClassifier class returns the classifier's current hyperparameter settings as a dictionary whose keys are the parameter names and whose values are the corresponding parameter values.
Example 1: To obtain the current hyperparameter settings of an XGBClassifier object named clf, we can use the following code:
params = clf.get_params()
print(params)
The output will be a dictionary with the current hyperparameter settings of the classifier.
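For a fuller picture, here is a minimal, self-contained sketch; the clf variable and the hyperparameter values passed to the constructor are illustrative assumptions rather than part of the original example:
# Illustrative sketch: inspect a classifier's hyperparameters with get_params
import xgboost as xgb
clf = xgb.XGBClassifier(n_estimators=200, max_depth=4)  # hypothetical settings
params = clf.get_params()
# Keys are parameter names, values are their current settings
print(params["n_estimators"])  # 200
print(params["max_depth"])     # 4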
Example 2: To set the n_estimators hyperparameter to a different value, we can use the set_params method:
clf.set_params(n_estimators=1000)
This sets the number of trees in the ensemble to 1000.
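As a sketch of how the two methods work together (again with illustrative starting values), set_params can be followed by get_params to confirm that the change took effect:
# Illustrative sketch: update n_estimators with set_params and verify via get_params
import xgboost as xgb
clf = xgb.XGBClassifier(n_estimators=100)   # hypothetical starting value
clf.set_params(n_estimators=1000)           # grow the ensemble to 1000 trees
print(clf.get_params()["n_estimators"])     # 1000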
Package library: The XGBoost package can be imported using the following command:
import xgboost as xgb
This command imports the XGBoost package under the conventional alias xgb, making its estimators and utilities available in your Python code.
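As a hedged end-to-end sketch, assuming a toy dataset generated with scikit-learn, the xgb alias can then be used to build, fit, and score a classifier:
# Illustrative sketch: use the xgb alias to train and evaluate an XGBClassifier
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
# Generate a small synthetic binary classification problem
X, y = make_classification(n_samples=200, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
# Fit a classifier with an assumed, illustrative number of trees
clf = xgb.XGBClassifier(n_estimators=50)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy on the held-out split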