Despite its name, logistic regression is not a regression algorithm but a classification algorithm.
For binary classification, it uses the logistic (**sigmoid**) function to map a linear combination of input features to a probability between 0 and 1, which is then thresholded (typically at 0.5) to assign a class.
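As a minimal sketch of this idea (plain Python, with made-up coefficient and feature values, not fitted on any real data), the sigmoid maps an arbitrary real-valued linear score to a probability in (0, 1), which is then thresholded at 0.5:

```python
import math

def sigmoid(z):
    """Logistic function: maps a real-valued score z to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical linear combination w . x + b for one sample
z = 0.8 * 1.5 + (-0.3) * 2.0 + 0.1   # = 0.7
p = sigmoid(z)                        # probability of the positive class
label = 1 if p >= 0.5 else 0          # threshold at 0.5
```

Note that thresholding the probability at 0.5 is equivalent to thresholding the linear score at 0, since sigmoid(0) = 0.5.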
For multiclass classification, logistic regression can be extended using strategies such as **one-vs-rest** (OvR) or **softmax regression**:

- In OvR, a separate binary classifier is trained for each species against all the others.
- **softmax regression** generalizes the logistic function to compute probabilities across all classes simultaneously, selecting the class with the highest probability.
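
The softmax step can be sketched in plain Python (the per-class scores below are made-up values, standing in for the linear combinations a fitted model would produce): the scores are converted into probabilities that sum to 1, and the class with the highest probability wins.

```python
import math

def softmax(scores):
    """Convert a list of per-class scores into probabilities summing to 1."""
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.1]                     # hypothetical linear scores for 3 classes
probs = softmax(scores)                      # predicted probability distribution
predicted = probs.index(max(probs))          # pick the most probable class
```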

1) The sigmoid function; 2) the softmax regression process: three input features to the softmax regression model resulting in three output vectors where each contains the predicted probabilities for three possible classes; 3) a bar chart of softmax outputs in which each group of bars represents the predicted probability distribution over three classes; 4-6) a binary classifier distinguishes one class from the other two classes using the one-vs-rest approach.

Creating a Logistic Regression model and fitting it to the training data are nearly identical to the steps used for the KNN model described above; only a different classifier is selected. The code example and the resulting confusion matrix plot are provided below:

.. code-block:: python

   from sklearn.linear_model import LogisticRegression
   from sklearn.metrics import accuracy_score

   lr_clf = LogisticRegression(random_state=0)
   lr_clf.fit(X_train_scaled, y_train)

   y_pred_lr = lr_clf.predict(X_test_scaled)

   score_lr = accuracy_score(y_test, y_pred_lr)
   print("Accuracy for Logistic Regression:", score_lr)