
stateful RNN

Hello!
I am currently working on speech enhancement software. I created a model with the Keras functional API and put it in an RNN wrapper. Currently it works on full sound files and everything is fine.

The states that I need the model/RNN layer to save are complex vectors:

[tf.zeros(shape=(batch_size, self.K), dtype=tf.complex64),
tf.zeros(shape=(batch_size, self.K), dtype=tf.complex64),
tf.zeros(shape=(batch_size, self.K), dtype=tf.complex64),
tf.ones(shape=(batch_size, self.K), dtype=tf.complex64)]
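A stripped-down sketch of the setup (the cell below is a stand-in with a placeholder recurrence and hypothetical names, not my actual enhancement model):

```python
import tensorflow as tf

# Hypothetical minimal cell with the four complex64 states listed above.
# The recurrence is a placeholder; only the state layout matters here.
class ComplexStateCell(tf.keras.layers.Layer):
    def __init__(self, K, **kwargs):
        super().__init__(**kwargs)
        self.K = K
        self.state_size = [K, K, K, K]
        self.output_size = K

    def get_initial_state(self, inputs=None, batch_size=None, dtype=None):
        zeros = tf.zeros((batch_size, self.K), dtype=tf.complex64)
        ones = tf.ones((batch_size, self.K), dtype=tf.complex64)
        return [zeros, zeros, zeros, ones]

    def call(self, inputs, states):
        # Placeholder recurrence: mix the real part of one state into the output.
        out = inputs + tf.math.real(states[0])
        return out, states

layer = tf.keras.layers.RNN(ComplexStateCell(1024))
out = layer(tf.zeros((1, 10, 1024)))  # stateful=False: this runs without errors
```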

However, since I want to turn this into a real-time application, I need the model to save some important states between calls. That's why I tried to make the RNN layer of my model stateful. But now I get a weird error:

Traceback (most recent call last):
  File "my python script", line 104, in <module>
    model_tmp.build_model()
  File "my python script", line 107, in build_model
    pred_out = layer(inp)
  File "...\lib\site-packages\keras\layers\recurrent.py", line 679, in __call__
    return super(RNN, self).__call__(inputs, **kwargs)
  File "...\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "...\AppData\Roaming\Python\Python39\site-packages\tensorflow\python\framework\ops.py", line 1662, in convert_to_tensor
    raise ValueError(
ValueError: Tensor conversion requested dtype float32 for Tensor with dtype complex64: <tf.Tensor: shape=(1, 1024), dtype=complex64, numpy=
array([[0.+0.j, 0.+0.j, 0.+0.j, ..., 0.+0.j, 0.+0.j, 0.+0.j]],
      dtype=complex64)>

So I debugged, and convert_to_tensor is called with the following arguments:

value: tf.Tensor([[0.+0.j 0.+0.j 0.+0.j ... 0.+0.j 0.+0.j 0.+0.j]], shape=(1, 1024), dtype=complex64)
dtype: <dtype: 'float32'>
name: 'initial_value'
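That argument combination is enough to reproduce the failure on its own: TensorFlow will not implicitly cast complex64 to float32, so the conversion raises regardless of the surrounding Keras machinery:

```python
import tensorflow as tf

# Reproduce the failing call in isolation: requesting a float32 tensor
# from complex64 data is not a valid implicit cast and raises ValueError.
z = tf.zeros((1, 1024), dtype=tf.complex64)
try:
    tf.convert_to_tensor(z, dtype=tf.float32, name="initial_value")
except ValueError as e:
    print(type(e).__name__)  # ValueError
```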

I just don't know why it works when the RNN layer is not stateful and suddenly fails when it is. I hope you can somehow help me.
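Judging from the name 'initial_value' in the failing call, the stateful path appears to create persistent tf.Variables for the states using the layer's floating dtype (float32 by default), which cannot hold complex64 values. A common workaround, sketched below with a hypothetical placeholder cell rather than the actual model, is to keep the persistent state real-valued: store real and imaginary parts side by side as float32 and rebuild the complex vector inside the cell.

```python
import tensorflow as tf

# Sketch of the workaround (assumed cell, placeholder recurrence): each
# complex state of width K is kept as one float32 state of width 2*K, so
# the variables Keras allocates for stateful=True match the layer dtype.
class RealPackedCell(tf.keras.layers.Layer):
    def __init__(self, K, **kwargs):
        super().__init__(**kwargs)
        self.K = K
        self.state_size = [2 * K]  # [real parts | imaginary parts]
        self.output_size = K

    def call(self, inputs, states):
        packed = states[0]                                      # float32, (batch, 2K)
        z = tf.complex(packed[:, :self.K], packed[:, self.K:])  # complex64, (batch, K)
        # ... run the complex-valued recurrence on z here ...
        new_packed = tf.concat([tf.math.real(z), tf.math.imag(z)], axis=-1)
        return tf.math.real(z) + inputs, [new_packed]

layer = tf.keras.layers.RNN(RealPackedCell(4), stateful=True)
y = layer(tf.zeros((2, 5, 4)))  # batch size must be fixed for stateful=True
```

The recurrence itself still runs on complex64 values; only the state that Keras persists between calls is real-valued, so the float32 variable creation no longer conflicts with the complex state.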

submitted by /u/QuantumProst
