Where to add the BatchNormalization function in Keras?

Batch Normalization is just another layer, so you can use it as such to create your desired network architecture.

The general use case is to insert BN between the linear and non-linear layers in your network, because it normalizes the input to your activation function so that you stay centered in the linear region of the activation function (such as the sigmoid). There's a small discussion of it here.
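In other words, the per-block ordering is Dense, then BatchNormalization, then Activation. A minimal sketch of that pattern (the layer width of 64 here is just a placeholder):

model.add(Dense(64))              # linear transformation
model.add(BatchNormalization())   # normalize the pre-activations
model.add(Activation('tanh'))     # the non-linearity sees the normalized values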

In your case above, this might look like:


# imports (BatchNormalization plus the other layers and the optimizer used below)
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout
from keras.layers.normalization import BatchNormalization
from keras.optimizers import SGD

# instantiate model
model = Sequential()

# we can think of this chunk as the input layer
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))

# we can think of this chunk as the hidden layer
model.add(Dense(64, init='uniform'))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))

# we can think of this chunk as the output layer
model.add(Dense(2, init='uniform'))
model.add(BatchNormalization())
model.add(Activation('softmax'))

# setting up the optimization of our weights
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)

# running the fitting
model.fit(X_train, y_train, nb_epoch=20, batch_size=16, show_accuracy=True,
          validation_split=0.2, verbose=2)
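Note that the snippet above uses the older Keras 1.x argument names (init, nb_epoch, show_accuracy). If you are on Keras 2.x, roughly the same model would look like the sketch below: kernel_initializer replaces init, epochs replaces nb_epoch, accuracy is requested through metrics, and depending on your exact version the SGD argument may need to be learning_rate rather than lr.

# rough Keras 2.x equivalent of the model above (same X_train / y_train assumed)
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout, BatchNormalization
from keras.optimizers import SGD

model = Sequential()
model.add(Dense(64, input_dim=14, kernel_initializer='uniform'))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(64, kernel_initializer='uniform'))
model.add(BatchNormalization())
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(2, kernel_initializer='uniform'))
model.add(BatchNormalization())
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd, metrics=['accuracy'])

model.fit(X_train, y_train, epochs=20, batch_size=16,
          validation_split=0.2, verbose=2)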