Index Keras tensors
The output layer of my Keras functional model is a tensor x of shape (None, 1344, 2).
Extracting the first n entries along the second axis is simple: just slice x[:, :n, :] for contiguous indices. It seems much harder, however, when the n indices are non-contiguous. Is there a clean way to do this in Keras?
Here is what I have tried so far.
Experiment 1 (slicing the tensor at contiguous indices, works):
print('My tensor shape is', K.int_shape(x)) #my tensor
(None, 1344, 2) # as printed in my code
print('Slicing first 5 entries, shape is', K.int_shape(x[:, :5, :]))
(None, 5, 2) # as printed in my code, works!
Experiment 2 (indexing the tensor at arbitrary indices, fails):
print('My tensor shape is', K.int_shape(x)) #my tensor
(None, 1344, 2) # as printed in my code
foo = np.array([1,2,4,5,8])
print('arbitrary indexing, shape is', K.int_shape(x[:,foo,:]))
This raises the following error:
ValueError: Shapes must be equal rank, but are 1 and 0
From merging shape 1 with other shapes. for 'strided_slice_17/stack_1' (op:
'Pack') with input shapes: [], [5], [].
Experiment 3 (backend functions):
I have also tried K.gather, but its usage is unclear to me because 1) the Keras documentation states that the indices should be an integer tensor, and there seems to be no Keras equivalent of numpy.where if my goal is to extract the entries of x that satisfy some condition, and 2) K.gather appears to gather entries along axis 0 only, whereas I want to extract from the second dimension of x.
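To illustrate point 2), here is a small sketch (with made-up toy shapes, not my real model) showing that K.gather only takes entries along the first axis:

```python
import numpy as np
from tensorflow.keras import backend as K

# A toy tensor of shape (3, 4); values 0..11 row by row.
x = K.constant(np.arange(12).reshape(3, 4))

# K.gather selects entries along axis 0 only:
picked = K.gather(x, K.constant([0, 2], dtype='int32'))
print(K.int_shape(picked))  # (2, 4) -- rows 0 and 2 of the first axis
```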
Solution
You are looking for tf.gather_nd, which indexes a tensor according to an array of indices:
# From documentation
indices = [[0, 0], [1, 1]]
params = [['a', 'b'], ['c', 'd']]
output = ['a', 'd']
To use it in a Keras model, make sure to wrap it in a layer, such as Lambda.
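A minimal sketch for the shapes in the question (the index array foo is taken from the question; note that for this particular case of gathering along a single axis, tf.gather with an axis argument is the simpler sibling of tf.gather_nd):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

foo = np.array([1, 2, 4, 5, 8])

inp = keras.Input(shape=(1344, 2))
# Wrap the gather op in a Lambda layer so it becomes part of the graph.
# tf.gather with axis=1 picks the listed positions of the second
# dimension for every sample in the batch.
out = layers.Lambda(lambda t: tf.gather(t, foo, axis=1))(inp)

model = keras.Model(inp, out)
print(model.output_shape)  # (None, 5, 2)
```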