Updated project with new features
alimovabdulla committed Jan 30, 2025
0 parents commit 27a8174
Showing 22 changed files with 675 additions and 0 deletions.
32 changes: 32 additions & 0 deletions .github/workflows/python-publish.yml
@@ -0,0 +1,32 @@
name: Publish Python package

on:
  push:
    branches:
      - main  # Runs only on pushes to the main branch

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: 3.9  # Adjust the Python version as needed

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install setuptools wheel twine

      - name: Build distribution
        run: |
          python setup.py sdist bdist_wheel

      - name: Upload to PyPI
        run: |
          python -m twine upload dist/* -u __token__ -p ${{ secrets.PYPI_API_TOKEN }}
10 changes: 10 additions & 0 deletions CHANGELOG.txt
@@ -0,0 +1,10 @@
# Changelog

## [1.0.0] - 2025-01-30
- Initial release with support for hyperparameter tuning of 20+ classifiers.
- Implemented GridSearchCV for model evaluation and selection.
- Added the ability to pass custom parameters for model tuning.
- Cross-validation support integrated.

## [Unreleased]
- Future improvements and features will be added in upcoming versions.
39 changes: 39 additions & 0 deletions GridSearchHelper.egg-info/PKG-INFO
@@ -0,0 +1,39 @@
Metadata-Version: 2.2
Name: GridSearchHelper
Version: 0.2.0
Summary: A library for hyperparameter tuning using grid search for machine learning models.
Home-page: https://github.com/username/ModelTuner
Author: Abdulla Alimov
Author-email: [email protected]
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.6
Description-Content-Type: text/markdown
License-File: LICENCE.txt
Requires-Dist: scikit-learn>=0.24.0
Requires-Dist: numpy>=1.19.0
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# Hyperparameter Tuning for Classifiers

This project implements a hyperparameter tuning utility for multiple classifiers using `GridSearchCV` from `sklearn`. The supported classifiers include Random Forest, Gradient Boosting, AdaBoost, SVM, K-Nearest Neighbors, Logistic Regression, Decision Trees, Naive Bayes, MLP, and more.

## Features
- Grid search for hyperparameter optimization on a variety of models.
- Support for additional custom parameters.
- Cross-validation (CV) support for model evaluation.
- Parallel processing for faster results.

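For example, the end-to-end flow looks like this (a minimal usage sketch, not part of this commit, assuming the package is installed and using scikit-learn's bundled iris dataset):

```python
# Minimal usage sketch: tune a RandomForest on the iris dataset.
from sklearn.datasets import load_iris

from GridSearchHelper import perform_grid_search

X, y = load_iris(return_X_y=True)

# Override a couple of grid entries to keep the search small;
# the remaining RandomForest parameters use the library's defaults.
best_params, best_score, grid_search = perform_grid_search(
    "RandomForest",
    X,
    y,
    additional_params={"n_estimators": [50, 100], "max_depth": [5, 10]},
    cv_folds=3,
    scoring="accuracy",
)

print("Best parameters:", best_params)
print("Best CV score:", best_score)
```
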
## Setup
1. Clone this repository:
```bash
git clone <repository_url>
14 changes: 14 additions & 0 deletions GridSearchHelper.egg-info/SOURCES.txt
@@ -0,0 +1,14 @@
CHANGELOG.txt
LICENCE.txt
MANIFEST.in
README.md
setup.py
GridSearchHelper/__init__.py
GridSearchHelper/grid_search.py
GridSearchHelper/models.py
GridSearchHelper.egg-info/PKG-INFO
GridSearchHelper.egg-info/SOURCES.txt
GridSearchHelper.egg-info/dependency_links.txt
GridSearchHelper.egg-info/not-zip-safe
GridSearchHelper.egg-info/requires.txt
GridSearchHelper.egg-info/top_level.txt
1 change: 1 addition & 0 deletions GridSearchHelper.egg-info/dependency_links.txt
@@ -0,0 +1 @@

1 change: 1 addition & 0 deletions GridSearchHelper.egg-info/not-zip-safe
@@ -0,0 +1 @@

2 changes: 2 additions & 0 deletions GridSearchHelper.egg-info/requires.txt
@@ -0,0 +1,2 @@
scikit-learn>=0.24.0
numpy>=1.19.0
1 change: 1 addition & 0 deletions GridSearchHelper.egg-info/top_level.txt
@@ -0,0 +1 @@
GridSearchHelper
26 changes: 26 additions & 0 deletions GridSearchHelper/__init__.py
@@ -0,0 +1,26 @@
# __init__.py

from .models import (
RandomForestClassifier,
GradientBoostingClassifier,
AdaBoostClassifier,
ExtraTreesClassifier,
BaggingClassifier,
HistGradientBoostingClassifier,
SVC,
LinearSVC,
KNeighborsClassifier,
LogisticRegression,
RidgeClassifier,
SGDClassifier,
PassiveAggressiveClassifier,
DecisionTreeClassifier,
GaussianNB,
BernoulliNB,
MultinomialNB,
MLPClassifier,
LinearDiscriminantAnalysis,
QuadraticDiscriminantAnalysis,
)

from .grid_search import get_param_grid, perform_grid_search
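
Because models.py simply re-exports the scikit-learn classes, the same names are importable from the package root. A small illustration (assumed usage, not part of the committed code):

```python
# The names re-exported in __init__.py are the plain scikit-learn classes,
# so they can be mixed freely with the grid-search helpers.
from GridSearchHelper import RandomForestClassifier, get_param_grid

clf = RandomForestClassifier(random_state=42)   # ordinary sklearn estimator
rf_grid = get_param_grid("RandomForest")        # default grid for that model
print(type(clf).__name__, sorted(rf_grid.keys()))
```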
194 changes: 194 additions & 0 deletions GridSearchHelper/grid_search.py
@@ -0,0 +1,194 @@
# grid_search.py

from sklearn.model_selection import GridSearchCV
from typing import Any, Callable, Dict, Optional, Tuple, Union
from numpy.typing import ArrayLike

from . import models  # classifier classes re-exported from models.py

def get_param_grid(model_name: str, additional_params: Optional[Dict[str, Union[int, float, list]]] = None) -> Dict[str, list]:
"""
Seçilən modelə uyğun hiperparametr gridini qaytarır.
Args:
model_name (str): Modelin adı.
additional_params (dict, optional): Əlavə parametrlər.
Returns:
dict: Parametrlər gridini qaytarır.
"""
param_grids = {
'RandomForest': {
'n_estimators': [50, 100, 200, 300],
'max_depth': [5, 10, 15, 20, None],
'min_samples_split': [2, 5, 10],
'min_samples_leaf': [1, 2, 4],
'max_features': ['sqrt', 'log2', None],
'bootstrap': [True, False]
},
'GradientBoosting': {
'n_estimators': [50, 100, 200],
'learning_rate': [0.01, 0.05, 0.1, 0.2],
'max_depth': [3, 5, 7, 9],
'subsample': [0.8, 0.9, 1.0],
'min_samples_split': [2, 5, 10],
'min_samples_leaf': [1, 2, 4]
},
'HistGradientBoosting': {
'max_iter': [50, 100, 200],
'learning_rate': [0.01, 0.1, 0.2],
'max_depth': [3, 5, 7],
'min_samples_leaf': [1, 5, 20],
'l2_regularization': [0, 1.0, 10.0]
},
'AdaBoost': {
'n_estimators': [50, 100, 200],
'learning_rate': [0.01, 0.1, 1.0],
'algorithm': ['SAMME', 'SAMME.R']
},
'ExtraTrees': {
'n_estimators': [50, 100, 200],
'max_depth': [5, 10, 15, None],
'min_samples_split': [2, 5, 10],
'min_samples_leaf': [1, 2, 4],
'max_features': ['sqrt', 'log2', None]
},
'Bagging': {
'n_estimators': [10, 30, 50],
'max_samples': [0.5, 0.7, 1.0],
'max_features': [0.5, 0.7, 1.0],
'bootstrap': [True, False],
'bootstrap_features': [True, False]
},
'SVC': {
'C': [0.1, 1, 10, 100],
'kernel': ['linear', 'rbf', 'poly', 'sigmoid'],
'gamma': ['scale', 'auto'],
'degree': [2, 3, 4],
'coef0': [0.0, 0.1, 0.5]
},
'LinearSVC': {
'C': [0.1, 1, 10],
'penalty': ['l1', 'l2'],
'dual': [True, False],
'max_iter': [1000, 2000, 5000]
},
'KNeighbors': {
'n_neighbors': [3, 5, 7, 9, 11],
'weights': ['uniform', 'distance'],
'metric': ['euclidean', 'manhattan', 'minkowski'],
'p': [1, 2],
'leaf_size': [10, 30, 50]
},
'LogisticRegression': {
'C': [0.001, 0.01, 0.1, 1, 10],
'penalty': ['l1', 'l2', 'elasticnet', None],
'solver': ['lbfgs', 'liblinear', 'newton-cg', 'sag', 'saga'],
'max_iter': [1000, 2000, 5000],
'l1_ratio': [0.2, 0.5, 0.8]
},
'RidgeClassifier': {
'alpha': [0.1, 1.0, 10.0],
'solver': ['auto', 'svd', 'cholesky', 'sparse_cg'],
'max_iter': [None, 1000, 2000]
},
'SGDClassifier': {
'loss': ['hinge', 'log_loss', 'modified_huber'],
'penalty': ['l1', 'l2', 'elasticnet'],
'alpha': [0.0001, 0.001, 0.01],
'max_iter': [1000, 2000, 5000],
'learning_rate': ['constant', 'optimal', 'adaptive']
},
'PassiveAggressive': {
'C': [0.1, 1.0, 10.0],
'max_iter': [1000, 2000, 5000],
'early_stopping': [True, False],
'validation_fraction': [0.1, 0.2]
},
'DecisionTree': {
'max_depth': [5, 10, 15, 20, None],
'min_samples_split': [2, 5, 10],
'min_samples_leaf': [1, 2, 4],
'max_features': ['sqrt', 'log2', None],
'criterion': ['gini', 'entropy']
},
'GaussianNB': {
'var_smoothing': [1e-9, 1e-8, 1e-7, 1e-6]
},
'BernoulliNB': {
'alpha': [0.1, 0.5, 1.0],
'binarize': [0.0, 0.5, None],
'fit_prior': [True, False]
},
'MultinomialNB': {
'alpha': [0.1, 0.5, 1.0],
'fit_prior': [True, False]
},
'MLPClassifier': {
'hidden_layer_sizes': [(50,), (100,), (50, 50), (100, 50)],
'activation': ['relu', 'tanh'],
'solver': ['adam', 'sgd'],
'alpha': [0.0001, 0.001, 0.01],
'learning_rate': ['constant', 'adaptive'],
'max_iter': [1000, 2000]
},
'LinearDiscriminantAnalysis': {
'solver': ['svd', 'lsqr', 'eigen'],
'shrinkage': [None, 'auto', 0.1, 0.5, 0.9]
},
'QuadraticDiscriminantAnalysis': {
'reg_param': [0.0, 0.1, 0.2],
'tol': [1e-4, 1e-3, 1e-2]
}
}

    if model_name not in param_grids:
        raise ValueError(
            f"Unknown model name '{model_name}'. Available models: {', '.join(param_grids.keys())}."
        )

    if additional_params:
        param_grids[model_name].update(additional_params)

return param_grids[model_name]

def perform_grid_search(
model_name: str,
X_train: ArrayLike,
y_train: ArrayLike,
additional_params: Optional[Dict[str, Any]] = None,
cv_folds: int = 5,
    scoring: Optional[Union[str, Callable]] = None,
verbose: int = 0,
n_jobs: int = -1
) -> Tuple[Dict[str, Any], float, GridSearchCV]:
"""
Seçilən model adı ilə GridSearchCV tətbiq edir və ən yaxşı hiperparametrləri tapır.
Args:
model_name (str): Modelin adı.
X_train (array-like): Təlim verilənlərinin xüsusiyyətləri.
y_train (array-like): Təlim verilənlərinin hədəf dəyişəni.
additional_params (dict, optional): Əlavə parametrlər.
cv_folds (int): Cross-validation üçün fold sayı.
scoring (str or callable, optional): Qiymətləndirmə metrikası.
verbose (int): Əlavə məlumatların çap edilməsi səviyyəsi.
n_jobs (int): Paralel işləmə üçün prosessor nüvələri sayı.
Returns:
tuple: Ən yaxşı parametrlər, ən yaxşı skor və GridSearchCV obyekti.
"""
    # Map the grid name to its classifier class in models.py
    # ('RandomForest' -> RandomForestClassifier, 'SVC' -> SVC, etc.).
    model_cls = getattr(models, model_name, None) or getattr(models, f"{model_name}Classifier", None)
    if model_cls is None:
        raise ValueError(f"No classifier class found for model name '{model_name}'.")
    model = model_cls()
    param_grid = get_param_grid(model_name, additional_params)

grid_search = GridSearchCV(
estimator=model,
param_grid=param_grid,
cv=cv_folds,
scoring=scoring,
verbose=verbose,
n_jobs=n_jobs
)

grid_search.fit(X_train, y_train)

return grid_search.best_params_, grid_search.best_score_, grid_search
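
To make the additional_params behaviour concrete, here is a small sketch (not part of this commit) showing how extra keys override or extend one of the built-in grids:

```python
# Sketch: customizing the SVC grid before a search.
from GridSearchHelper import get_param_grid

default_grid = get_param_grid("SVC")
print(default_grid["C"])             # [0.1, 1, 10, 100]

# Keys passed via additional_params replace existing entries or add new ones.
custom_grid = get_param_grid(
    "SVC",
    additional_params={"C": [0.5, 5], "shrinking": [True, False]},
)
print(custom_grid["C"])              # [0.5, 5]
print(custom_grid["shrinking"])      # [True, False]
```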
25 changes: 25 additions & 0 deletions GridSearchHelper/models.py
@@ -0,0 +1,25 @@
# models.py

from sklearn.ensemble import (
RandomForestClassifier,
GradientBoostingClassifier,
AdaBoostClassifier,
ExtraTreesClassifier,
BaggingClassifier,
HistGradientBoostingClassifier,
)
from sklearn.svm import SVC, LinearSVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import (
LogisticRegression,
RidgeClassifier,
SGDClassifier,
PassiveAggressiveClassifier,
)
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB, BernoulliNB, MultinomialNB
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import (
LinearDiscriminantAnalysis,
QuadraticDiscriminantAnalysis,
)
24 changes: 24 additions & 0 deletions LICENCE.txt
@@ -0,0 +1,24 @@

MIT License

Copyright (c) 2025

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.