```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([1, 2, 3])
poly = PolynomialFeatures(2)
# reshape to a column vector; each row becomes [1, x, x^2]
X_poly = poly.fit_transform(X.reshape(-1, 1))
print(X_poly)
```
```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[1, 2], [3, 4], [5, 6]])
# interaction_only=True: keep cross terms but drop pure powers (x1^2, x2^2)
poly = PolynomialFeatures(2, interaction_only=True)
X_poly = poly.fit_transform(X)
print(X_poly)

model = LinearRegression()
model.fit(X_poly, [1, 3, 5])
print(model.coef_)
```

In this example, we start with a 2D array of 3 samples (rows) and 2 features (columns). With `interaction_only=True`, `PolynomialFeatures` generates the bias term, the original features $x_1$ and $x_2$, and the interaction term $x_1 x_2$, while omitting the pure powers $x_1^2$ and $x_2^2$. We then train a linear regression model on the expanded data, so the fitted model includes the interaction term. Overall, `PolynomialFeatures` is a useful transformer in Python's scikit-learn library for creating higher-order polynomial features from existing data, which can improve the performance of linear models on problems with nonlinear structure.
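To make that last claim concrete, here is a small sketch on synthetic quadratic data (the data and variable names are illustrative, not from the examples above): a plain linear regression fits the curve poorly, while the same model with degree-2 polynomial features fits it almost perfectly.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Synthetic data from a quadratic function plus small noise.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = 0.5 * X.ravel() ** 2 + rng.normal(scale=0.1, size=50)

linear = LinearRegression().fit(X, y)
quadratic = make_pipeline(PolynomialFeatures(2), LinearRegression()).fit(X, y)

print(linear.score(X, y))     # low R^2: a straight line cannot capture the curve
print(quadratic.score(X, y))  # near 1.0 once the x^2 feature is added
```

Chaining the transformer and the model in a `Pipeline` also ensures the same polynomial expansion is applied consistently at fit and predict time.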