@@ -25,15 +25,15 @@ And Classic - create model from function with parameters.
First import the constructor class, then create a model constructor object.
- ``` python
+ ```
from model_constructor.net import *
```

- ``` python
+ ```
model = Net()
```

- ``` python
+ ```
model
```

Now we have a model constructor, with default settings as xresnet18. And we can get the model by calling it.

- ``` python
+ ```
model.c_in
```
@@ -60,7 +60,7 @@ model.c_in
- ``` python
+ ```
model.c_out
```
@@ -71,7 +71,7 @@ model.c_out
- ``` python
+ ```
model.stem_sizes
```
@@ -82,7 +82,7 @@ model.stem_sizes
- ``` python
+ ```
model.layers
```
@@ -93,7 +93,7 @@ model.layers
- ``` python
+ ```
model.expansion
```
@@ -104,7 +104,7 @@ model.expansion
- ``` python
+ ```
%nbdev_collapse_output
model()
```
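The attribute-then-call flow above is the configurable-callable pattern: attributes hold the configuration, calling the object builds the result. A plain-Python sketch of the idea (the `SimpleNet` class is illustrative, not the library's API):

``` python
# Minimal sketch of the constructor-object pattern: attributes hold the
# configuration, calling the object builds the final product.
class SimpleNet:
    def __init__(self, c_in=3, c_out=1000, expansion=1, layers=None):
        self.c_in = c_in              # input channels
        self.c_out = c_out            # number of classes
        self.expansion = expansion    # block expansion factor
        self.layers = layers or [2, 2, 2, 2]  # xresnet18-like default

    def __call__(self):
        # a real constructor would assemble nn.Module layers here;
        # we just return a summary of what would be built
        return f"in={self.c_in}, out={self.c_out}, layers={self.layers}"

net = SimpleNet()
net.c_out = 10    # tweak before building, like model.c_out above
print(net())      # → in=3, out=10, layers=[2, 2, 2, 2]
```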
@@ -285,14 +285,14 @@ model()
If you want to change the model, just change the constructor parameters.
Let's create xresnet50.

- ``` python
+ ```
model.expansion = 4
model.layers = [3,4,6,3]
```
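As a hedged sanity check on why these two settings give a 50-layer network: with `expansion = 4` the blocks are bottlenecks of three convolutions each, and under the classic ResNet counting (one stem conv plus the final linear layer) the arithmetic works out:

``` python
layers = [3, 4, 6, 3]    # blocks per stage, as set above
convs_per_block = 3      # bottleneck block (expansion=4): three convs
# classic ResNet counting: stem conv + block convs + final linear layer
depth = 1 + sum(layers) * convs_per_block + 1
print(depth)  # → 50
```

(xresnet variants actually use a deeper 3-conv stem; the "50" follows the classic counting convention.)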
Now we can look at the model body, and if we call the constructor, we get a PyTorch model!

- ``` python
+ ```
%nbdev_collapse_output
model.body
```
@@ -640,7 +640,7 @@ model.body
</details >
- ``` python
+ ```
model.block_szs
```
@@ -661,25 +661,25 @@ But now lets create model as mxresnet50 from fastai forums tread https://forums.
Let's create an mxresnet constructor.

- ``` python
+ ```
model = Net(name='MxResNet')
```

Then let's modify the stem.

- ``` python
+ ```
model.stem_sizes = [3,32,64,64]
```

Now let's change the activation function to Mish.
Here is a link to the forum discussion: https://forums.fast.ai/t/meet-mish-new-activation-function-possible-successor-to-relu
Mish is in model_constructor.layer.

- ``` python
+ ```
model.act_fn = Mish()
```
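For reference, Mish is defined as x · tanh(softplus(x)); a standalone scalar sketch (not the library's implementation, which operates on tensors):

``` python
import math

def mish(x: float) -> float:
    # mish(x) = x * tanh(softplus(x)), where softplus(x) = ln(1 + e^x)
    return x * math.tanh(math.log1p(math.exp(x)))

print(mish(0.0))  # → 0.0 (tanh(ln 2) is finite, so 0 * ... = 0)
```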
- ``` python
+ ```
model
```
@@ -693,7 +693,7 @@ model
- ``` python
+ ```
%nbdev_collapse_output
model()
```
@@ -875,7 +875,7 @@ model()
Now let's make MxResNet50.

- ``` python
+ ```
model.expansion = 4
model.layers = [3,4,6,3]
model.name = 'mxresnet50'
@@ -885,7 +885,7 @@ Now we have mxresnet50 constructor.
We can inspect every part of it.
And after calling it, we get the model.
- ``` python
+ ```
model
```
@@ -899,7 +899,7 @@ model
- ``` python
+ ```
%nbdev_collapse_output
model.stem.conv_1
```
@@ -919,7 +919,7 @@ model.stem.conv_1
</details >
- ``` python
+ ```
%nbdev_collapse_output
model.body.l_0.bl_0
```
@@ -961,17 +961,17 @@ model.body.l_0.bl_0
Now let's change the ResBlock to YaResBlock (Yet another ResNet block, formerly NewResBlock), available in the library since version 0.1.0.

- ``` python
+ ```
from model_constructor.yaresnet import YaResBlock
```

- ``` python
+ ```
model.block = YaResBlock
```
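This swap works because the constructor stores the block *class* and only instantiates it when the model is built. A plain-Python sketch of that pattern (all class names here are illustrative, not the library's):

``` python
class ResBlock:
    def describe(self):
        return "classic residual block"

class YaBlock:
    def describe(self):
        return "alternative residual block"

class Constructor:
    block = ResBlock    # default block class, like Net.block

    def build(self):
        # the block class is instantiated only at build time, so
        # reassigning .block beforehand changes the architecture
        return self.block().describe()

c = Constructor()
c.block = YaBlock       # same move as model.block = YaResBlock
print(c.build())        # → alternative residual block
```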
That's all. Now we have a YaResNet constructor.

- ``` python
+ ```
model.name = 'YaResNet'
model
```
@@ -1146,7 +1146,7 @@ model
Let's see what we have.

- ``` python
+ ```
%nbdev_collapse_output
model.body.l_1.bl_0
```
@@ -1189,43 +1189,43 @@ model.body.l_1.bl_0
The usual way to get a model is to call the constructor with parameters.

- ``` python
+ ```
from model_constructor.constructor import *
```

Default is resnet18.

- ``` python
+ ```
model = Net()
```

You can't modify the model after calling the constructor, so define the model with parameters.
For example, resnet34:

- ``` python
+ ```
resnet34 = Net(block=BasicBlock, blocks=[3, 4, 6, 3])
```
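A hedged check of the depth arithmetic: a BasicBlock contains two convolutions, and with the classic counting (stem conv plus final linear layer) `blocks=[3, 4, 6, 3]` gives the 34 in "resnet34":

``` python
blocks = [3, 4, 6, 3]   # as passed to Net above
convs_per_block = 2     # BasicBlock: two 3x3 convolutions
depth = 1 + sum(blocks) * convs_per_block + 1
print(depth)  # → 34
```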
## Predefined Resnet models - 18, 34, 50.
- ``` python
+ ```
from model_constructor.resnet import *
```

- ``` python
+ ```
model = resnet34(num_classes=10)
```

- ``` python
+ ```
%nbdev_hide_output
model
```

- ``` python
+ ```
model = resnet50(num_classes=10)
```

- ``` python
+ ```
%nbdev_hide_output
model
```
@@ -1234,15 +1234,15 @@ model
This is a simplified version from fastai v1. I refactored it to make it easier to understand and to experiment with models. For example, it's very simple to change activation functions, use different stems, or swap the batchnorm and activation order, etc. The v2 implementation is much more powerful.

- ``` python
+ ```
from model_constructor.xresnet import *
```

- ``` python
+ ```
model = xresnet50()
```

- ``` python
+ ```
%nbdev_hide_output
model
```
@@ -1258,11 +1258,11 @@ Here is some examples:
Stem with 3 conv layers
- ``` python
+ ```
model = Net(stem=partial(Stem, stem_sizes=[32, 32]))
```
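`functools.partial` is doing the work here: it pre-fills `stem_sizes` so the constructor can later call the stem factory without repeating the argument. A generic sketch of the same trick (the `make_stem` factory is hypothetical, not the library's `Stem`):

``` python
from functools import partial

def make_stem(c_in=3, stem_sizes=(32, 32, 64)):
    # a real stem factory would return conv layers;
    # here we return the planned channel sequence instead
    return [c_in, *stem_sizes]

# pre-configure the factory, like partial(Stem, stem_sizes=[32, 32])
custom_stem = partial(make_stem, stem_sizes=(32, 32))
print(custom_stem())  # → [3, 32, 32]
```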
- ``` python
+ ```
%nbdev_collapse_output
model.stem
```
@@ -1296,11 +1296,11 @@ model.stem
</details >
- ``` python
+ ```
model = Net(stem_sizes=[32, 64])
```

- ``` python
+ ```
%nbdev_collapse_output
model.stem
```
@@ -1336,11 +1336,11 @@ model.stem
### Activation function before Normalization
- ``` python
+ ```
model = Net(bn_1st=False)
```

- ``` python
+ ```
model.stem
```
@@ -1362,15 +1362,15 @@ model.stem
### Change activation function
- ``` python
+ ```
new_act_fn = nn.LeakyReLU(inplace=True)
```
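LeakyReLU passes positive inputs through unchanged and scales negative ones by a small slope (0.01 is PyTorch's default `negative_slope`); a scalar sketch:

``` python
def leaky_relu(x: float, negative_slope: float = 0.01) -> float:
    # matches nn.LeakyReLU's default negative_slope of 0.01
    return x if x > 0 else negative_slope * x

print(leaky_relu(2.0))   # → 2.0
print(leaky_relu(-2.0))  # → -0.02
```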
- ``` python
+ ```
model = Net(act_fn=new_act_fn)
```

- ``` python
+ ```
%nbdev_collapse_output
model.stem
```
@@ -1394,7 +1394,7 @@ model.stem
</details >
- ``` python
+ ```
%nbdev_collapse_output
model.body.layer_0.block_0
```