@@ -25,15 +25,15 @@ And by creating a constructor object, then modifying it and then creating the model.

First import the constructor class, then create a model constructor object.

- ```
+ ``` python
from model_constructor.net import *
```

- ```
+ ``` python
model = Net()
```

- ```
+ ``` python
model
```

@@ -46,7 +46,7 @@

Now we have a model constructor with default settings as xresnet18, and we can get the model by calling it.

- ```
+ ``` python
model.c_in
```

@@ -57,7 +57,7 @@ model.c_in



- ```
+ ``` python
model.c_out
```

@@ -68,7 +68,7 @@ model.c_out



- ```
+ ``` python
model.stem_sizes
```

@@ -79,7 +79,7 @@ model.stem_sizes



- ```
+ ``` python
model.layers
```

@@ -90,7 +90,7 @@ model.layers



- ```
+ ``` python
model.expansion
```

@@ -101,7 +101,7 @@ model.expansion



- ```
+ ``` python
model()
```

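The `model()` call shown above returns a regular PyTorch module. A minimal sketch of the full flow (only torch and model_constructor are required; the 3x224x224 dummy input shape is an assumption based on `c_in` being 3):

``` python
# Sketch: build the constructor, call it to get an nn.Module, run a dummy batch.
# The 3x224x224 input shape is an assumption (c_in defaults to 3 above).
import torch
from model_constructor.net import Net

model = Net()                              # constructor with xresnet18-like defaults
net = model()                              # calling the constructor returns the model
out = net(torch.randn(2, 3, 224, 224))     # dummy batch of two RGB images
print(out.shape)                           # (2, model.c_out)
```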
@@ -277,14 +277,14 @@ model()
If you want to change the model, just change the constructor parameters.
Let's create xresnet50.

- ```
+ ``` python
model.expansion = 4
model.layers = [3, 4, 6, 3]
```

Now we can look at the model body, and if we call the constructor we get a PyTorch model!

- ```
+ ``` python
model.body
```

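As a rough sanity check on the xresnet50-style settings above, one can rebuild the model and count its parameters (a sketch, not part of the library; the expected count is only a ballpark):

``` python
# Sketch: rebuild the model after changing expansion/layers and count parameters.
model.expansion = 4
model.layers = [3, 4, 6, 3]
net = model()                                        # rebuild with the new settings
n_params = sum(p.numel() for p in net.parameters())
print(f"{n_params:,} parameters")                    # roughly resnet50-sized
```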
@@ -623,7 +623,7 @@ model.body



- ```
+ ``` python
model.block_szs
```

@@ -644,20 +644,20 @@ But now let's create a model as mxresnet50 from the fastai forums thread https://forums.

Let's create the mxresnet constructor.

- ```
+ ``` python
mxresnet = Net()
```

Then let's modify the stem.

- ```
+ ``` python
mxresnet.stem_sizes = [3, 32, 64, 64]
```

Now let's change the activation function to Mish.
Here is a link to the forum discussion: https://forums.fast.ai/t/meet-mish-new-activation-function-possible-successor-to-relu

- ```
+ ``` python
class Mish(nn.Module):
    def __init__(self):
        super().__init__()
@@ -666,7 +666,7 @@ class Mish(nn.Module):
        return x * (torch.tanh(F.softplus(x)))
```

- ```
+ ``` python
mxresnet.expansion = 4
mxresnet.layers = [3, 4, 6, 3]
mxresnet.act_fn = Mish()
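The Mish definition in the hunks above relies on `torch`, `torch.nn as nn`, and `torch.nn.functional as F` already being imported. Here is a self-contained sketch of the same activation (note that PyTorch 1.9+ also ships a built-in `nn.Mish`):

``` python
# Self-contained version of the Mish activation used above.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Mish(nn.Module):
    """Mish activation: x * tanh(softplus(x))."""

    def forward(self, x):
        return x * torch.tanh(F.softplus(x))


x = torch.randn(4, 8)
print(torch.allclose(Mish()(x), nn.Mish()(x)))  # True on PyTorch >= 1.9
```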
@@ -677,7 +677,7 @@ Now we have the mxresnet50 constructor.
We can inspect some parts of it.
And after calling it we get the model.

- ```
+ ``` python
mxresnet
```

@@ -688,7 +688,7 @@ mxresnet



- ```
+ ``` python
mxresnet.stem.conv_1
```

@@ -703,7 +703,7 @@ mxresnet.stem.conv_1



- ```
+ ``` python
mxresnet.body.l_0.bl_0
```

@@ -738,13 +738,13 @@ mxresnet.body.l_0.bl_0

Now let's change the ResBlock. NewResBlock (still not its own name yet) is in the lib from version 0.1.0.

- ```
+ ``` python
mxresnet.block = NewResBlock
```

That's all. Let's see what we have.

- ```
+ ``` python
mxresnet.body.l_1.bl_0
```

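Pulling the steps from the hunks above together, here is a hedged sketch of the whole mxresnet50 recipe (it assumes `from model_constructor.net import *` provides `Net` and `NewResBlock`, and that `Mish` is defined as shown earlier):

``` python
# Consolidated mxresnet50 recipe from the steps above (a sketch, not a library helper).
mxresnet = Net()
mxresnet.stem_sizes = [3, 32, 64, 64]   # modified stem
mxresnet.expansion = 4                  # resnet50-style bottleneck expansion
mxresnet.layers = [3, 4, 6, 3]          # resnet50 layer counts
mxresnet.act_fn = Mish()                # Mish activation, defined above
mxresnet.block = NewResBlock            # NewResBlock, available from version 0.1.0

model = mxresnet()                      # calling the constructor returns the model
```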
@@ -782,46 +782,46 @@ mxresnet.body.l_1.bl_0

The usual way to get a model is to call the constructor with parameters.

- ```
+ ``` python
from model_constructor.constructor import *
```

Default is resnet18.

- ```
+ ``` python
model = Net()
```

You can't modify the model after calling the constructor, so define the model with parameters.
For example, resnet34:

- ```
+ ``` python
resnet34 = Net(block=BasicBlock, blocks=[3, 4, 6, 3])
```

## Predefined Resnet models - 18, 34, 50.

- ```
+ ``` python
from model_constructor.resnet import *
```

- ```
+ ``` python
model = resnet34(num_classes=10)
```

- ```
+ ``` python
model = resnet50(num_classes=10)
```

## Predefined Xresnet from fastai 1.

This is a simplified version from fastai v1. I refactored it to make it easier to understand and experiment with models. For example, it's very simple to change activation functions, use different stems, change the batchnorm and activation order, etc. The v2 realisation is much more powerful.

- ```
+ ``` python
from model_constructor.xresnet import *
```

- ```
+ ``` python
model = xresnet50()
```

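As a quick check of the predefined constructors above, here is a sketch of a forward pass (assumes torch is installed; the 10-class output shape follows from `num_classes=10`):

``` python
# Sketch: verify the predefined resnet50 produces num_classes outputs.
import torch
from model_constructor.resnet import resnet50

model = resnet50(num_classes=10)
out = model(torch.randn(2, 3, 224, 224))   # dummy batch of two RGB images
print(out.shape)                           # expected: torch.Size([2, 10])
```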
@@ -836,11 +836,11 @@ Here are some examples:

Stem with 3 conv layers

- ```
+ ``` python
model = Net(stem=partial(Stem, stem_sizes=[32, 32]))
```

- ```
+ ``` python
model.stem
```

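The stem example above uses `partial` and `Stem` without showing where they come from; here is a sketch of the full set-up (the import path for `Stem` is an assumption based on the star-imports used earlier in this README):

``` python
# Sketch of the imports the stem example above relies on.
# The import path for Stem is an assumption (the README uses star-imports).
from functools import partial

from model_constructor.net import Net, Stem

model = Net(stem=partial(Stem, stem_sizes=[32, 32]))
print(model.stem)   # stem with 3 conv layers
```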
@@ -869,11 +869,11 @@ model.stem



- ```
+ ``` python
model = Net(stem_sizes=[32, 64])
```

- ```
+ ``` python
model.stem
```

@@ -904,11 +904,11 @@ model.stem

### Activation function before Normalization

- ```
+ ``` python
model = Net(bn_1st=False)
```

- ```
+ ``` python
model.stem
```

@@ -930,15 +930,15 @@ model.stem

### Change activation function

- ```
+ ``` python
new_act_fn = nn.LeakyReLU(inplace=True)
```

- ```
+ ``` python
model = Net(act_fn=new_act_fn)
```

- ```
+ ``` python
model.stem
```

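The activation swap above assumes `torch.nn` is already imported as `nn`; a self-contained sketch:

``` python
# Self-contained version of the activation swap above.
import torch.nn as nn
from model_constructor.net import Net

model = Net(act_fn=nn.LeakyReLU(inplace=True))
print(model.stem)   # the stem convs should now report LeakyReLU instead of ReLU
```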
@@ -957,7 +957,7 @@ model.stem



- ```
+ ``` python
model.body.layer_0.block_0
```
