HPOBench is a library providing benchmarks for (multi-fidelity) hyperparameter optimization, with a focus on reproducibility.

A list of benchmarks can be found in the [wiki](https://github.com/automl/HPOBench/wiki/Available-Containerized-Benchmarks), and a guide on how to contribute benchmarks is available [here](https://github.com/automl/HPOBench/wiki/How-to-add-a-new-benchmark-step-by-step).
We recommend using a virtual environment. To install HPOBench, please run the following:
```
git clone https://github.com/automl/HPOBench.git
cd HPOBench
pip install .
```
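As suggested above, you may want to create and activate a virtual environment before installing. A minimal sketch using Python's built-in `venv` module (the environment name `hpobench_env` is just an example):

```shell
# Create a fresh virtual environment (the name is arbitrary)
python3 -m venv hpobench_env

# Activate it; subsequent pip installs go into this environment
source hpobench_env/bin/activate
```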
**Note:** This does not install *singularity (version 3.6)*. Please follow the steps described here: [user-guide](https://sylabs.io/guides/3.6/user-guide/quick_start.html#quick-installation-steps).
If you run into problems, using the most recent Singularity version might help: see the [installation guide](https://singularity.hpcng.org/admin-docs/master/installation.html).
## Containerized Benchmarks
We provide all benchmarks as containerized versions to (i) isolate their dependencies and (ii) keep them reproducible. Our containerized benchmarks do not rely on external dependencies and thus do not change over time. For this, we rely on [Singularity (version 3.6)](https://sylabs.io/guides/3.6/user-guide/) and for now upload all containers to a [gitlab registry](https://gitlab.tf.uni-freiburg.de/muelleph/hpobench-registry/container_registry).
The only other requirements are [ConfigSpace](https://github.com/automl/ConfigSpace), *scipy*, and *numpy*.
### Run a Benchmark Locally
Each benchmark can also be run locally, but the dependencies must be installed manually and might conflict with other benchmarks. This can be arbitrarily complex and further information can be found in the docstring of the benchmark.
A simple example is the XGBoost benchmark, which can be installed with `pip install .[xgboost]`:
```python
from hpobench.benchmarks.ml.xgboost_benchmark_old import XGBoostBenchmark

# The task id and fidelity values below are example values.
b = XGBoostBenchmark(task_id=167149)
config = b.get_configuration_space(seed=1).sample_configuration()
result_dict = b.objective_function(configuration=config,
                                   fidelity={"n_estimators": 128, "dataset_fraction": 0.5},
                                   rng=1)
```
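To make the call pattern concrete without installing any benchmark dependencies, here is a self-contained toy stand-in (purely illustrative: the real benchmarks derive from HPOBench's abstract benchmark class, but the `function_value`/`cost` result keys follow HPOBench's result convention):

```python
import random


class ToyBenchmark:
    """Illustrative stand-in mimicking the HPOBench call pattern."""

    class _Space:
        def __init__(self, seed=None):
            self._rng = random.Random(seed)

        def sample_configuration(self):
            # A configuration is just a mapping of hyperparameter names to values.
            return {"eta": self._rng.uniform(0.01, 0.3),
                    "max_depth": self._rng.randint(1, 10)}

    def get_configuration_space(self, seed=None):
        return self._Space(seed)

    def objective_function(self, configuration, rng=None):
        # HPOBench benchmarks return a dict containing at least
        # 'function_value' (the loss to minimize) and 'cost' (e.g. wallclock time).
        loss = abs(configuration["eta"] - 0.1) + 0.01 * configuration["max_depth"]
        return {"function_value": loss, "cost": 1.0}


b = ToyBenchmark()
config = b.get_configuration_space(seed=1).sample_configuration()
result = b.objective_function(configuration=config, rng=1)
print(sorted(result))  # → ['cost', 'function_value']
```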
## Further Notes

### Configure the HPOBench

All of HPOBench's settings are stored in the `hpobenchrc` file, a YAML file which is automatically generated on first use of HPOBench.
By default, it is placed in `$XDG_CONFIG_HOME` (or, if that is not set, in `~/.config/hpobench`). This file defines where to store containers, datasets, and much more. We highly recommend having a look at this file once it is created. Furthermore, please make sure you have write permission in these directories, or adapt them if necessary. For more information on where data is stored, please see the section on `HPOBench Data` below.
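A `hpobenchrc` might, for example, contain entries along these lines. The keys and values below are purely illustrative, not defaults to copy; consult the file generated by your HPOBench version for the real entries:

```yaml
# Illustrative hpobenchrc sketch -- field names are assumptions, check your generated file
cache_dir: ~/.cache/hpobench                  # cached intermediate results
container_dir: ~/.cache/hpobench/containers   # downloaded singularity containers
data_dir: ~/.local/share/hpobench             # benchmark datasets
socket_dir: /tmp                              # unix sockets for container communication
```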
Furthermore, for running containers, we rely on Unix sockets, which by default are located in `$TEMP_DIR` (or, if that is not set, in `/tmp`).
### Remove all data, containers, and caches
Feel free to use `hpobench/util/clean_up_script.py` to remove all data, downloaded containers and caches:
```bash
python ./hpobench/util/clean_up_script.py --help
```
If you would like to delete only specific parts, e.g. a single container, you can find the benchmark's data, containers, and caches in the following directories:
#### HPOBench Data
HPOBench stores downloaded containers and datasets at the following locations: