Commit f79cc92

FIX: move the BatchNormalization before the activation with no bias (#531)
1 parent: b62143d

File tree: 2 files changed (+16, -7 lines)


doc/whats_new/v0.5.rst

+9 lines

@@ -22,3 +22,12 @@ Maintenance
 
 - Make it possible to ``import imblearn`` and access submodule.
   :issue:`500` by :user:`Guillaume Lemaitre <glemaitre>`.
+
+Bug
+...
+
+- Fix wrong usage of :class:`keras.layers.BatchNormalization` in
+  ``porto_seguro_keras_under_sampling.py`` example. The batch normalization
+  was moved before the activation function and the bias was removed from the
+  dense layer.
+  :issue:`531` by :user:`Guillaume Lemaitre <glemaitre>`.
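
Why removing the bias is safe: BatchNormalization subtracts the per-feature
batch mean from its input, so any constant bias added by the preceding Dense
layer cancels out, and the layer's own learned beta offset plays that role
instead. A minimal numpy sketch of the cancellation (an illustration, not
part of the commit):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(256, 8))   # a batch of pre-activation outputs
bias = rng.normal(size=8)       # a constant per-unit bias

def batch_norm(z, eps=1e-3):
    # Normalize each feature to zero mean and unit variance over the
    # batch, as BatchNormalization does at training time (gamma=1, beta=0).
    return (z - z.mean(axis=0)) / np.sqrt(z.var(axis=0) + eps)

# The mean subtraction absorbs the bias, so both stacks are identical.
assert np.allclose(batch_norm(x + bias), batch_norm(x))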

examples/applications/porto_seguro_keras_under_sampling.py

+7, -7 lines

@@ -98,20 +98,20 @@ def make_model(n_features):
     model = Sequential()
     model.add(Dense(200, input_shape=(n_features,),
                     kernel_initializer='glorot_normal'))
-    model.add(Activation('relu'))
     model.add(BatchNormalization())
-    model.add(Dropout(0.5))
-    model.add(Dense(100, kernel_initializer='glorot_normal'))
     model.add(Activation('relu'))
+    model.add(Dropout(0.5))
+    model.add(Dense(100, kernel_initializer='glorot_normal', use_bias=False))
     model.add(BatchNormalization())
-    model.add(Dropout(0.25))
-    model.add(Dense(50, kernel_initializer='glorot_normal'))
     model.add(Activation('relu'))
+    model.add(Dropout(0.25))
+    model.add(Dense(50, kernel_initializer='glorot_normal', use_bias=False))
     model.add(BatchNormalization())
-    model.add(Dropout(0.15))
-    model.add(Dense(25, kernel_initializer='glorot_normal'))
     model.add(Activation('relu'))
+    model.add(Dropout(0.15))
+    model.add(Dense(25, kernel_initializer='glorot_normal', use_bias=False))
     model.add(BatchNormalization())
+    model.add(Activation('relu'))
     model.add(Dropout(0.1))
     model.add(Dense(1, activation='sigmoid'))
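
After this change every hidden block follows the same Dense ->
BatchNormalization -> Activation -> Dropout ordering (only the first Dense
layer keeps its default bias, as in the diff above). A compact restatement
of the post-fix architecture, sketched with a hypothetical dense_bn_relu
helper and a placeholder feature count rather than the example's real input:

from keras.models import Sequential
from keras.layers import Activation, BatchNormalization, Dense, Dropout

def dense_bn_relu(model, units, rate):
    # Hypothetical helper, not in the example: one post-fix block. The
    # Dense bias is dropped because the following BatchNormalization
    # re-centers its input and learns its own offset.
    model.add(Dense(units, kernel_initializer='glorot_normal',
                    use_bias=False))
    model.add(BatchNormalization())
    model.add(Activation('relu'))
    model.add(Dropout(rate))

n_features = 10  # placeholder; the example passes the real feature count
model = Sequential()
model.add(Dense(200, input_shape=(n_features,),
                kernel_initializer='glorot_normal'))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(0.5))
for units, rate in [(100, 0.25), (50, 0.15), (25, 0.1)]:
    dense_bn_relu(model, units, rate)
model.add(Dense(1, activation='sigmoid'))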
