Python – Binarized tensors in Keras


I need to create a loss function for Keras that works only on binary values, so I want to convert all values greater than 0.5 to 1.0 (and the rest to 0.0). I did this:

from keras import backend as K

def MyLoss(y_true, y_pred):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(K.cast(K.greater(y_pred, 0.5), 'float32'))
    #y_pred_f = K.flatten(K.cast(y_pred > 0.5, 'float32'))
    #y_pred_f = K.flatten(y_pred > 0.5)
    return K.sum(y_true_f * y_pred_f)

The code compiles, but raises the following error at training time:

ValueError: None values not supported.

I also tried the commented-out variants; they give the same error. If I don't modify the values and simply use y_pred_f = K.flatten(y_pred), it runs.

What am I doing wrong?

How do you binarize tensors?

Solution

My solution, for binarizing the latent layer of a semantic hashing autoencoder (after Hinton), was to do the rounding inside a custom Lambda used as the activation. Keras issued a warning, but it turned out it still worked. Earlier attempts threw errors because the round function cannot be differentiated when gradients are computed in the backpropagation phase. (This is the ValueError: None values not supported from above.) The key is to operate in the activation rather than in a separate layer.

encoder_outputs = Dense(
    units=latent_vector_len,
    activation=k.layers.Lambda(lambda z: k.backend.round(k.activations.sigmoid(z))),
    kernel_initializer="lecun_normal",
)(x)

Real-valued outputs that would normally fall in the range 0 to 1 are converted to hard 0s and 1s, as the output below shows.

# Look, it works!

y = encoder_model.predict(x=x_in)
print(y)
>>> [[1. 0. 0. 1. 0. 1. 0. 0.]]
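
For completeness, here is a minimal self-contained sketch of how an encoder model following this pattern might be built and called. The input shape, hidden layer, latent_vector_len value, and dummy input are illustrative assumptions, not part of the original answer.

import numpy as np
import keras as k

latent_vector_len = 8                 # assumed code size
inputs = k.layers.Input(shape=(16,))  # assumed input shape
h = k.layers.Dense(32, activation="relu")(inputs)
encoder_outputs = k.layers.Dense(
    units=latent_vector_len,
    # Same pattern as above: sigmoid squashes to (0, 1), round snaps to {0, 1}
    activation=k.layers.Lambda(lambda z: k.backend.round(k.activations.sigmoid(z))),
    kernel_initializer="lecun_normal",
)(h)
encoder_model = k.models.Model(inputs=inputs, outputs=encoder_outputs)

x_in = np.random.rand(1, 16)          # dummy input, for illustration only
y = encoder_model.predict(x=x_in)
print(y)                              # a vector of hard 0s and 1s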

By contrast, the same rounding applied as a separate Lambda layer does not work:

# ERROR at training time: ValueError: None values not supported.
decoder_outputs_bin = k.layers.Lambda(lambda z: k.backend.round(z))(decoder_outputs)
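
Not part of the original answer, but a common workaround if you do need the rounding as a separate layer and still want gradients to flow: a straight-through estimator, which rounds on the forward pass but passes the identity gradient on the backward pass. A minimal sketch, assuming the same keras-as-k imports as above:

import keras as k

def straight_through_round(z):
    # Forward pass computes round(z); stop_gradient blocks gradients through
    # (round(z) - z), so the backward pass sees only the gradient of z.
    return z + k.backend.stop_gradient(k.backend.round(z) - z)

decoder_outputs_bin = k.layers.Lambda(straight_through_round)(decoder_outputs)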
