Question: In Google Colab's Python, create a grid search program that optimizes the model's inputs for the best accuracy score, using the code below. Assume the data has already gone through dimensionality reduction. The data itself is not given; the question asks for the structure of a grid search, not the entire analysis. All that is needed is to know that the model is logistic regression with a binary target (values 0 or 1).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix

train_cols = ["feature1", "feature2", "feature3", "feature4", "feature5",
              "feature6", "feature7", "feature8", "feature9", "feature10"]
X, y = df[train_cols], df["y"]

# Hold out the last 1000 rows as the test set (no shuffling).
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=1000, shuffle=False)
print("Train--Test size", len(y_train), len(y_test))
print(X_train.shape)
print(X_test.shape)

# CREATE A GRID SEARCH TO OPTIMIZE THE MODEL HERE

log_reg = LogisticRegression()
log_reg.fit(X_train, y_train)
y_pred = log_reg.predict(X_test)
print(confusion_matrix(y_test, y_pred))
print("Accuracy is: ", accuracy_score(y_test, y_pred))
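One way to fill in the grid-search step is scikit-learn's GridSearchCV, which cross-validates every combination in a parameter grid and keeps the best estimator. The sketch below is a minimal example: the parameter values in param_grid are illustrative assumptions, and make_classification stands in for the real df, which is not given.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

# Stand-in data: 10 features, binary target (0 or 1), since df is not given.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

# Illustrative hyperparameter grid for logistic regression (assumed values).
param_grid = {
    "C": [0.01, 0.1, 1, 10, 100],   # inverse regularization strength
    "penalty": ["l1", "l2"],        # regularization type
    "solver": ["liblinear"],        # liblinear supports both penalties
    "max_iter": [1000],
}

grid = GridSearchCV(
    LogisticRegression(),
    param_grid,
    scoring="accuracy",  # optimize for accuracy, as the question asks
    cv=5,                # 5-fold cross-validation on the training data
    n_jobs=-1,
)
grid.fit(X, y)

print("Best params:", grid.best_params_)
print("Best CV accuracy:", grid.best_score_)
best_model = grid.best_estimator_  # refit on all data with the best params
```

In the original script, grid.fit would be called with X_train and y_train, and best_model would then replace the plain log_reg when predicting on X_test.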