AttributeError: 'Tensor' object has no attribute '_keras_history'

I looked through all the "'Tensor' object has no attribute ***" questions, but none seems related to Keras (except for TensorFlow: AttributeError: 'Tensor' object has no attribute 'log10', which didn't help)...

I am making a sort of GAN (Generative Adversarial Network). Here is its structure:

Layer (type)                     Output Shape          Param #         Connected to                     
_____________________________________________________________________________
input_1 (InputLayer)             (None, 30, 91)        0                                            
_____________________________________________________________________________
model_1 (Model)                  (None, 30, 1)         12558           input_1[0][0]                    
_____________________________________________________________________________
model_2 (Model)                  (None, 30, 91)        99889           input_1[0][0]                    
                                                                       model_1[1][0]                    
_____________________________________________________________________________
model_3 (Model)                  (None, 1)             456637          model_2[1][0]                    
_____________________________________________________________________________

I pretrained model_2 and model_3. The thing is, I pretrained model_2 with lists made of 0s and 1s, but model_1 returns approximate values. So I considered rounding model_1's output with the following code: the K.round() on model1_out.

import keras.backend as K
[...]
def make_gan(GAN_in, model1, model2, model3):
    model1_out = model1(GAN_in)
    model2_out = model2([GAN_in, K.round(model1_out)])
    GAN_out = model3(model2_out)
    GAN = Model(GAN_in, GAN_out)
    GAN.compile(loss=loss, optimizer=model1.optimizer, metrics=['binary_accuracy'])
    return GAN
[...]

I get the following error:

AttributeError: 'Tensor' object has no attribute '_keras_history'

Full traceback:

Traceback (most recent call last):
  File "C:\Users\Asmaa\Documents\BillyValuation\GFD.py", line 88, in <module>
    GAN = make_gan(inputSentence, G, F, D)
  File "C:\Users\Asmaa\Documents\BillyValuation\GFD.py", line 61, in make_gan
    GAN = Model(GAN_in, GAN_out)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\legacy\interfaces.py", line 88, in wrapper
    return func(*args, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 1705, in __init__
    build_map_of_graph(x, finished_nodes, nodes_in_progress)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 1695, in build_map_of_graph
    layer, node_index, tensor_index)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 1695, in build_map_of_graph
    layer, node_index, tensor_index)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 1665, in build_map_of_graph
    layer, node_index, tensor_index = tensor._keras_history
AttributeError: 'Tensor' object has no attribute '_keras_history'

I'm using Python 3.6, with Spyder 3.1.4, on Windows 7. I upgraded TensorFlow and Keras with pip last week. Thank you for any help provided!

Wickham answered 3/7, 2017 at 15:22 Comment(9)
Try putting K.round inside a Lambda layer. It's not usual to see operations outside layers in keras. (Not sure this is the problem, though).Lousy
@Daniel: with rounded = Lambda(lambda x: K.round(x))(G_out) and F_out = F([GAN_in, rounded]), I could "compile" but not "fit" anymore.Edam
So, what's the new error? (Lambda layers normally need an output_shape. In your case, the same shape of x)Lousy
@Daniel: Apologies, I didn't want to digress. The new error is ValueError: Tried to convert 'x' to a tensor and failed. Error: None values not supported. I have never used a Lambda layer before, I might have gotten it wrong. The example in the Keras docs doesn't use any other parameters.Edam
Keep in mind that TensorFlow only supports version 3.5.x of Python on Windows: tensorflow.org/install/install_windowsNicolette
@MaëvaLC are you using the + operator in your code, other than in this appended code?Shaunna
@Media I don't (cf. the link to the code at that time). Since then, I have used an alternative solution to my problem.Edam
I said that because the + operator does not work properly and you have to use the Add method of Keras insteadShaunna
@MaëvaLC were you able to solve the issue? I am stuck on the same error and I am not using any + to replace with Keras' Add()Deicide

My problem was using '+' instead of 'Add' in Keras.
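For reference, here is a minimal sketch of that swap (hypothetical tensors x1 and x2, assuming the standalone Keras functional API used in the question):

from keras.layers import Input, Dense, Add

inp = Input(shape=(16,))
x1 = Dense(8)(inp)
x2 = Dense(8)(inp)

# merged = x1 + x2          # plain tensor op: the result has no _keras_history
merged = Add()([x1, x2])    # Keras layer: the result stays trackable by Model()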

Olid answered 21/12, 2017 at 10:34 Comment(0)

Since the error comes directly from here:

Traceback (most recent call last):
  File "C:\Users\Asmaa\Documents\BillyValuation\GFD.py", line 88, in <module>
    GAN = make_gan(inputSentence, G, F, D)
  File "C:\Users\Asmaa\Documents\BillyValuation\GFD.py", line 61, in make_gan
    GAN = Model(GAN_in, GAN_out)

and since the inputs of your models depend on the outputs of previous models, I believe the bug lies in the code of your models.

In your model code, please check line by line whether you apply a non-Keras operation, especially in the last few lines. For example, for element-wise addition, you might intuitively use + or even numpy.add, but keras.layers.Add() should be used instead.
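A quick way to audit a suspect tensor, as a sketch (assuming standalone Keras 2.x on the TF 1.x backend, as in the question; newer tf.keras versions wrap such ops automatically):

from keras.layers import Input, Dense, Add

inp = Input(shape=(32,))
a = Dense(16)(inp)
b = Dense(16)(inp)

bad = a + b                   # plain backend op, bypasses Keras' graph tracking
good = Add()([a, b])          # proper Keras layer

print(hasattr(bad, '_keras_history'))   # False -> Model() would fail on this tensor
print(hasattr(good, '_keras_history'))  # True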

Sherwin answered 8/2, 2018 at 6:43 Comment(1)
I had the same issue and this solved it. I had used "+" previously in the code and it didn't throw an error until I tried to create the model. Replacing the "+" with keras' Add() fixed the issue.Toussaint

@Maëva LC: I can't post a comment, so this answers your None issue. You wrote:

"but the code is working fine without the line

model1_out = Lambda(lambda x: K.round(x), output_shape=...)(model1_out)

and not touching anything else. Anyway, thank you for trying."

The round() function is not differentiable, hence its gradient is None. I suggest you just remove that line.
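A quick check of that claim (a sketch assuming the TF 1.x graph mode used in the question):

import tensorflow as tf
import keras.backend as K

x = K.placeholder(shape=(None, 1))
y = K.round(x)
print(tf.gradients(y, x))   # [None]: Round has no registered gradient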

Earthshine answered 11/7, 2017 at 3:48 Comment(0)

Try this:

# Assuming the imports elided by the question's [...] (Lambda from keras.layers):
from keras.models import Model
from keras.layers import Lambda
import keras.backend as K

def make_gan(GAN_in, model1, model2, model3):
    model1_out = model1(GAN_in)
    # wrap the backend op in a Lambda layer so Keras can track it
    model1_out = Lambda(lambda x: K.round(x), output_shape=...)(model1_out)
    model2_out = model2([GAN_in, model1_out])
    GAN_out = model3(model2_out)
    GAN = Model(GAN_in, GAN_out)
    GAN.compile(loss=loss, optimizer=model1.optimizer,
                metrics=['binary_accuracy'])
    return GAN
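As for the output_shape=... placeholder, the comment thread above suggests the same shape as x; since rounding is element-wise, one way to fill it in (an assumption, not part of the original answer) is to pass the input shape through unchanged:

# Hypothetical fill-in: K.round is element-wise, so output shape == input shape
model1_out = Lambda(lambda x: K.round(x),
                    output_shape=lambda input_shape: input_shape)(model1_out)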
Anthocyanin answered 4/7, 2017 at 14:14 Comment(1)
but the code is working fine without the line model1_out = Lambda(lambda x: K.round(x), output_shape=...)(model1_out) and not touching anything else. Anyway, thank you for trying.Edam

This is supported in TensorFlow 1.x; you are probably using version 2.x.

%tensorflow_version 1.x

Use the above %tensorflow_version magic before importing TensorFlow in Google Colab.

This is not valid in a plain Jupyter notebook; please use Google Colab.
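As a sketch, a full Colab cell would look like this (assuming Colab still offers the TF 1.x runtime; the magic does not exist in plain Jupyter):

# Colab-only magic; must run before TensorFlow is imported
%tensorflow_version 1.x
import tensorflow as tf
print(tf.__version__)   # expect something like 1.15.x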

Sarcasm answered 16/4, 2021 at 7:49 Comment(0)

I also faced the same problem. When I used x = relu(x), I got the same error. To overcome this, I defined a function and used a Lambda layer:

# Assuming tf.keras (the original snippet doesn't show its imports):
from tensorflow.keras import layers
from tensorflow.keras.activations import relu

def relu_func(x):
    return relu(x)

x = layers.Lambda(relu_func)(x)
Goines answered 27/11, 2022 at 21:40 Comment(0)
