Commit 1e7a882: Organize files
1 parent 44693ac

30 files changed, +242 -4 lines changed

BootstrapNAS/README.md

+52
@@ -0,0 +1,52 @@
# BootstrapNAS Jupyter Notebooks

---

<p align="center">
<img src="architecture.png" alt="BootstrapNAS Architecture" width="500"/>
</p>

BootstrapNAS (1) takes as input a pre-trained model. (2) It uses this model to generate a weight-sharing super-network. (3) BootstrapNAS then applies a training strategy, and once the super-network has been trained, (4) it searches for efficient sub-networks that satisfy the user's requirements. (5) The configuration of the discovered sub-network(s) is returned to the user.

## Quickstart

Please follow the instructions [here](https://github.com/jpablomch/bootstrapnas/wiki/Quickstart).

If you already have a super-network trained with BootstrapNAS, please follow the instructions to search for sub-networks [here](https://github.com/jpablomch/bootstrapnas/wiki/Subnetwork_Search).

More information about BootstrapNAS is available in our papers:

[Automated Super-Network Generation for Scalable Neural Architecture Search](https://openreview.net/attachment?id=HK-zmbTB8gq&name=main_paper_and_supplementary_material)

```bibtex
@inproceedings{
  munoz2022automated,
  title={Automated Super-Network Generation for Scalable Neural Architecture Search},
  author={Mu{\~{n}}oz, J. Pablo and Lyalyushkin, Nikolay and Lacewell, Chaunte and Senina, Anastasia and Cummings, Daniel and Sarah, Anthony and Kozlov, Alexander and Jain, Nilesh},
  booktitle={First Conference on Automated Machine Learning (Main Track)},
  year={2022},
  url={https://openreview.net/forum?id=HK-zmbTB8gq}
}
```

[Enabling NAS with Automated Super-Network Generation](https://arxiv.org/abs/2112.10878)

```bibtex
@article{
  bootstrapNAS,
  author = {Mu{\~{n}}oz, J. Pablo and Lyalyushkin, Nikolay and Akhauri, Yash and Senina, Anastasia and Kozlov, Alexander and Jain, Nilesh},
  title = {Enabling NAS with Automated Super-Network Generation},
  journal = {CoRR},
  volume = {abs/2112.10878},
  year = {2021},
  url = {https://arxiv.org/abs/2112.10878},
  eprinttype = {arXiv},
  eprint = {2112.10878},
  timestamp = {Tue, 04 Jan 2022 15:59:27 +0100},
  biburl = {https://dblp.org/rec/journals/corr/abs-2112-10878.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```

## Contributing to BootstrapNAS

Please follow the contribution guidelines in [NNCF](https://github.com/openvinotoolkit/nncf) and reach out to [Pablo Muñoz](mailto:[email protected]) if you have any questions.

BootstrapNAS/architecture.png

314 KB
File renamed without changes.
File renamed without changes.

examples/bootstrapNAS/bootstrapnas_utils.py renamed to BootstrapNAS/examples/bootstrapnas_utils.py

+1-1
@@ -552,7 +552,7 @@ def resnet50_cifar10(pretrained=False, progress=True, device="cpu", **kwargs):
 def create_folders_demo(base_model_name):
     from pathlib import Path
     # MODEL_DIR = Path("model")
-    MODEL_DIR = Path("../../models/pretrained")
+    MODEL_DIR = Path("../models/pretrained")
     OUTPUT_DIR = Path("output")
     DATA_DIR = Path("data")
     BASE_MODEL_NAME = base_model_name
+84
@@ -0,0 +1,84 @@
### Configuration file

The parameters for generating, training, and searching on the super-network are defined in a configuration file, in two separate subsets of parameters, one for training and one for search:

```json
"bootstrapNAS": {
    "training": {
        ...
    },
    "search": {
        ...
    }
}
```

In the `training` section, you specify the training algorithm, e.g., `progressive_shrinking`, the schedule, and the elasticity parameters:

```json
"training": {
    "algorithm": "progressive_shrinking",
    "progressivity_of_elasticity": ["depth", "width"],
    "batchnorm_adaptation": {
        "num_bn_adaptation_samples": 1500
    },
    "schedule": {
        "list_stage_descriptions": [
            {"train_dims": ["depth"], "epochs": 25, "depth_indicator": 1, "init_lr": 2.5e-6, "epochs_lr": 25},
            {"train_dims": ["depth"], "epochs": 40, "depth_indicator": 2, "init_lr": 2.5e-6, "epochs_lr": 40},
            {"train_dims": ["depth", "width"], "epochs": 50, "depth_indicator": 2, "reorg_weights": true, "width_indicator": 2, "bn_adapt": true, "init_lr": 2.5e-6, "epochs_lr": 50},
            {"train_dims": ["depth", "width"], "epochs": 50, "depth_indicator": 2, "reorg_weights": true, "width_indicator": 3, "bn_adapt": true, "init_lr": 2.5e-6, "epochs_lr": 50}
        ]
    },
    "elasticity": {
        "available_elasticity_dims": ["width", "depth"],
        "width": {
            "max_num_widths": 3,
            "min_width": 32,
            "width_step": 32,
            "width_multipliers": [1, 0.80, 0.60]
        },
        ...
    }
}
```

In the `search` section, you specify the search algorithm, e.g., `NSGA-II`, and its parameters. For example:

```json
"search": {
    "algorithm": "NSGA2",
    "num_evals": 3000,
    "population": 50,
    "ref_acc": 93.65
}
```

By default, BootstrapNAS uses `NSGA-II` (Deb et al., 2002), a genetic algorithm that constructs a Pareto front of efficient sub-networks.

List of parameters that can be used in the configuration file:

**Training:**

`algorithm`: Defines the training strategy for tuning the super-network. By default, `progressive_shrinking`.

`progressivity_of_elasticity`: Defines the order in which a new elasticity dimension is added from stage to stage, e.g., `["width", "depth", "kernel"]`.

`batchnorm_adaptation`: Specifies the number of samples from the training dataset to use for model inference during the BatchNorm statistics adaptation procedure for the compressed model.

`schedule`: The schedule section includes a list of stage descriptors (`list_stage_descriptions`) that specify the elasticity dimensions enabled for a particular stage (`train_dims`), the number of `epochs` for the stage, the `depth_indicator`, which, in the case of elastic depth, restricts the maximum number of blocks in each independent group that can be skipped, and the `width_indicator`, which restricts the maximum number of width values in each elastic layer. The user can also specify whether weights should be reorganized (`reorg_weights`), whether batch norm adaptation should be triggered at the beginning of the stage (`bn_adapt`), the initial learning rate for the stage (`init_lr`), and the number of epochs used to adjust the learning rate (`epochs_lr`).

`elasticity`: Currently, BootstrapNAS supports three elastic dimensions (`kernel`, `width`, and `depth`). The `mode` for elastic depth can be set to `auto` or `manual`. If `manual` is selected, the user can specify a list of possible `skipped_blocks` that, as the name suggests, might be skipped. In `auto` mode, the user can specify the `min_block_size`, i.e., the minimal number of operations in a skipped block, and the `max_block_size`, i.e., the maximal number of operations in the block. The user can also `allow_nested_blocks` or `allow_linear_combination` of blocks. In the case of elastic width, the user can specify the `min_width`, i.e., the minimal number of output channels that can be activated for each layer with elastic width (32 by default); the `max_num_widths`, which restricts the total number of different elastic width values for each layer; a `width_step`, which defines a step size for the generation of the elastic width search space; or a `width_multipliers` list to define the elastic width search space via multipliers. Finally, the user can determine the type of filter importance metric: L1, L2, or geometric mean; L2 is selected by default. For elastic kernel, the user can specify the `max_num_kernels`, which restricts the total number of different elastic kernel values for each layer.

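For illustration only, a hedged sketch of how the elastic depth (`auto` mode) and elastic kernel parameters described above could be combined in an `elasticity` section; the numeric values are placeholders rather than recommendations, and the full schema linked at the end of this page remains the authoritative reference:

```json
"elasticity": {
    "available_elasticity_dims": ["depth", "kernel"],
    "depth": {
        "mode": "auto",
        "min_block_size": 6,
        "max_block_size": 50,
        "allow_nested_blocks": false,
        "allow_linear_combination": false
    },
    "kernel": {
        "max_num_kernels": 3
    }
}
```
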
`train_steps`: Defines the number of samples used for each training epoch.

**Search:**

`algorithm`: Defines the search algorithm. The default algorithm is NSGA-II.

`num_evals`: Defines the number of evaluations that will be used by the search algorithm.

`population`: Defines the population size when using an evolutionary search algorithm.

`acc_delta`: Defines the absolute difference in accuracy that is tolerated when looking for a sub-network.

`ref_acc`: Defines the reference accuracy of the pre-trained model used to generate the super-network.

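For instance, a hedged variant of the search section shown earlier that also sets `acc_delta`; all values are placeholders, not recommendations:

```json
"search": {
    "algorithm": "NSGA2",
    "num_evals": 1000,
    "population": 40,
    "acc_delta": 1.0,
    "ref_acc": 93.65
}
```
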
A full list of the possible configuration parameters can be found [here](https://github.com/jpablomch/nncf_bootstrapnas/blob/develop/nncf/config/experimental_schema.py).

BootstrapNAS/instructions/Home.md

+10
@@ -0,0 +1,10 @@
### BootstrapNAS: Automated Super-Network Generation for Scalable Neural Architecture Search

### [Quickstart](https://github.com/jpablomch/bootstrapnas/wiki/Quickstart)

### [Sub-network Search](https://github.com/jpablomch/bootstrapnas/wiki/Subnetwork_Search)

### [Configuration](https://github.com/jpablomch/bootstrapnas/wiki/Configuration)

More detailed guides coming soon.
+70
@@ -0,0 +1,70 @@
# Setup

### PyTorch
Install PyTorch and Torchvision using the [PyTorch installation guide](https://pytorch.org/get-started/locally/#start-locally). NNCF currently supports PyTorch >=1.5.0, <=1.9.1 (1.8.0 is not supported). For this quickstart, PyTorch 1.9.1 and Torchvision 0.10.1 with CUDA 11.1 were installed using:
```bash
pip install torch==1.9.1+cu111 torchvision==0.10.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
```

### NNCF
There are two options for installing [***NNCF***](https://github.com/openvinotoolkit/nncf#installation):
- a package built from the NNCF repository, or
- the PyPI package.

To install NNCF and its dependencies from the NNCF repository, run the following in the repository root directory and set the `PYTHONPATH` variable to include the root directory:
```bash
python setup.py develop
export PYTHONPATH="${PYTHONPATH}:<Directory Path>/nncf"
```

To install NNCF and its dependencies as a PyPI package, use the following:
```bash
pip install nncf
```

The `examples` folder from the NNCF repository ***is not*** included when you install NNCF using a package manager. To run the BootstrapNAS examples, you will need to obtain this folder from the repository and add it to your path.

### Additional Dependencies
The examples in the NNCF repo have additional requirements, such as EfficientNet, MLflow, Tensorboard, etc., which are not installed with NNCF. You will need to install them using:
```bash
pip install efficientnet_pytorch tensorboard mlflow returns
```

# Example
To run an example of super-network generation and sub-network search, use the `bootstrap_nas.py` script located [here](https://github.com/openvinotoolkit/nncf/blob/develop/examples/experimental/torch/classification/bootstrap_nas.py) and the sample `config.json` from [here](https://github.com/jpablomch/bootstrapnas/blob/main/bootstrapnas_examples/config.json).

The file `config.json` contains a sample configuration for generating a super-network from a trained model. The sample file is configured to generate a super-network from ResNet-50 trained on CIFAR-10. The file should be modified depending on the model to be used as input for BootstrapNAS.

Weights for CIFAR-10-based models can be found at: https://github.com/huyvnphan/PyTorch_CIFAR10

Use the following to test training a super-network:
```bash
cd <path to NNCF>/examples/experimental/torch/classification
python bootstrap_nas.py -m train \
  -c <path to this repo>/bootstrapnas_examples/config.json \
  --data <path to your CIFAR-10 dataset> \
  --weights <path to weights for ResNet-50 trained with CIFAR-10>
```

### Expected Output Files after executing BootstrapNAS
The output of running `bootstrap_nas.py` will be a sub-network configuration that has an accuracy similar to the input model (by default a $\pm$1% absolute difference in accuracy is allowed), but with improvements in MACs. The format is `[MACs_subnet, ACC_subnet]`.

Several files are saved to your `log_dir` after the training has ended:

- `compressed_graph.{dot, png}` - Dot and PNG files that describe the wrapped NNCF model.
- `original_graph.dot` - Dot file that describes the original model.
- `config.json` - A copy of your original config file.
- `events.*` - Tensorboard logs.
- `last_elasticity.pth` - The super-network's elasticity information. This file can be used when loading super-networks for searching or inspection.
- `last_model_weights.pth` - The super-network's weights after training.
- `snapshot.tar.gz` - A copy of the code used for this run.
- `subnetwork_best.pth` - A dictionary with the configuration of the best sub-network, where "best" is defined as a sub-network that lies on the Pareto front and deviates at most `acc_delta` from the original model's accuracy.
- `supernet_{best, last}.pth` - The super-network's weights at its best and last states.

If you want a CSV file of the search progression, call `search_algo.search_progression_to_csv()` after running the search step.

For a visualization of the search progression, call `search_algo.visualize_search_progression()` after the search has concluded. A PNG file will be generated.

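A minimal sketch of those two calls in context, assuming `search_algo` is the search-algorithm object produced during the search step (how it is obtained depends on your driver script and is not shown in this guide):

```python
# `search_algo` is assumed to be the BootstrapNAS search algorithm instance
# returned by the search step; the two calls below are the ones mentioned above.

search_algo.search_progression_to_csv()     # write the search progression to a CSV file
search_algo.visualize_search_progression()  # render the search progression as a PNG file
```
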
@@ -0,0 +1,23 @@
### Search on an existing super-network

If you have a trained super-network, you can start the search stage directly using the `bootstrap_nas_search.py` script located [here](https://github.com/openvinotoolkit/nncf/blob/develop/examples/experimental/torch/classification/bootstrap_nas_search.py).

You will need to pass the path where the weights and elasticity information have been stored, which by default is your log directory.

```shell
python bootstrap_nas_search.py -m train \
  --config <path to the config.json used when training the super-network> \
  --log-dir <path to your log dir for the search stage> \
  --dataset <cifar10, imagenet, or other depending on your model> \
  --data <path to your dataset> \
  --elasticity-state-path <path to the last_elasticity.pth file generated when training the super-network> \
  --supernet-weights <path to the last_model_weights.pth file generated during training of the super-network> \
  --search-mode
```

#### Hardware-aware search

BootstrapNAS can be made hardware-aware when searching for efficient sub-networks. To accomplish this, you can pass your own efficiency evaluator for the target hardware to the search component.

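The sketch below is only an illustration of the idea, not the actual BootstrapNAS interface: the function name, its signature, and the way an evaluator is registered with the search component are assumptions, so please check the NNCF examples for the real API. Conceptually, an efficiency evaluator is a callable that scores an activated sub-network on the target hardware, for example by timing its forward pass:

```python
import time

import torch


def measure_latency_ms(model, input_shape=(1, 3, 32, 32), warmup=10, runs=50):
    """Hypothetical efficiency evaluator: mean forward-pass latency in milliseconds.

    `model` is assumed to be the PyTorch module with the candidate sub-network
    activated; a real evaluator could instead query a lookup table, a power
    monitor, or a simulator for the target device.
    """
    model.eval()
    dummy_input = torch.randn(*input_shape)
    with torch.no_grad():
        for _ in range(warmup):  # warm-up iterations are not timed
            model(dummy_input)
        start = time.perf_counter()
        for _ in range(runs):
            model(dummy_input)
        elapsed = time.perf_counter() - start
    return elapsed * 1000.0 / runs
```

The search would then trade such a score off against accuracy when constructing the Pareto front of candidate sub-networks.
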
File renamed without changes.

README.md

+2-1
@@ -1,6 +1,7 @@
 # Hardware-Aware Automated Machine Learning Research

-Auxiliary resources for using AutoX - BootstrapNAS solution in the Neural Network Compression Framework (NNCF).
+### [NNCF's BootstrapNAS - Notebooks and Examples](./BootstrapNAS/README.md)
examples/README.md

-2
This file was deleted.
