tensorflow [TF 2.0] tf.keras.optimizers.Adam

System information

  • TensorFlow version: 2.0.0-dev20190618
  • Python version: 3.6

Describe the current behavior I am trying to minimize a function using tf.keras.optimizers.Adam.minimize() and I am getting a TypeError.

Describe the expected behavior First, the TF 2.0 docs say the loss can be a callable taking no arguments which returns the value to minimize. The TypeError I get reads "'tensorflow.python.framework.ops.EagerTensor' object is not callable", which is technically accurate but does not point clearly to the actual mistake.

But the main issue is this: I know I can perform the same optimization using GradientTape, but I don't understand why I should have to, or why minimize() is not working. A similar issue I found is linked here: (https://github.com/tensorflow/tensorflow/issues/28068), and a Stack Overflow answer showing how to solve a similar problem with GradientTape, for reference: (https://stackoverflow.com/questions/55060736/tensorflow-2-api-regression-tensorflow-python-framework-ops-eagertensor-object).
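For context, here is a sketch of the GradientTape workaround mentioned above (the loop count and learning rate are arbitrary choices, not from the original report):

```python
import numpy as np
import tensorflow as tf

np.random.seed(0)
X = tf.Variable(np.random.randn(4, 1))      # optimization variable
C = tf.constant(np.random.randn(1000, 4))   # fixed data
D = tf.constant(np.random.randn(1000, 1))

loss_before = float(tf.reduce_sum(tf.square(tf.matmul(C, X) - D)))

opt = tf.keras.optimizers.Adam()
for _ in range(200):
    # Record the forward computation on the tape, then apply gradients manually
    with tf.GradientTape() as tape:
        loss = tf.reduce_sum(tf.square(tf.matmul(C, X) - D))
    grads = tape.gradient(loss, [X])
    opt.apply_gradients(zip(grads, [X]))

loss_after = float(loss)
```

This works because the tape records the computation each iteration, so there is no stale, already-evaluated tensor involved.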

Code to reproduce the issue

import tensorflow as tf
import numpy as np

N  = 1000                               # Number of samples
n  = 4                                  # Dimension of the optimization variable
np.random.seed(0) 
X = tf.Variable(np.random.randn(n, 1))  # Variables will be tuned by the optimizer
C = tf.constant(np.random.randn(N, n))  # Constants will not be tuned by the optimizer
D = tf.constant(np.random.randn(N, 1))

def f_batch_tensorflow(x, A, B):
    e = tf.matmul(A, x) - B
    return tf.reduce_sum(tf.square(e))

fx = f_batch_tensorflow(X, C, D)
print(fx)

adam_opt = tf.keras.optimizers.Adam()
optimizer = adam_opt.minimize(fx, X)
print(optimizer)

Other info / logs The following is the error I am getting:

TypeError                                 Traceback (most recent call last)
<ipython-input-20-225de189a3ff> in <module>()
      7 
      8 adam_opt = tf.keras.optimizers.Adam()
----> 9 optimizer = adam_opt.minimize(fx, X)
     10 print(optimizer)

1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/optimizer_v2/optimizer_v2.py in _compute_gradients(self, loss, var_list, grad_loss)
    347       if not callable(var_list):
    348         tape.watch(var_list)
--> 349       loss_value = loss()
    350     if callable(var_list):
    351       var_list = var_list()

TypeError: 'tensorflow.python.framework.ops.EagerTensor' object is not callable

By the time minimize is called, the function has already been executed and fx is just an EagerTensor holding its result. You need to pass f_batch_tensorflow (or rather, a zero-argument partial of it) to minimize, not the result of executing it.
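Concretely, wrapping the call in a lambda (one way to form the zero-argument partial; the iteration count here is arbitrary) makes the reproduction above work:

```python
import numpy as np
import tensorflow as tf

np.random.seed(0)
X = tf.Variable(np.random.randn(4, 1))      # optimization variable
C = tf.constant(np.random.randn(1000, 4))   # fixed data
D = tf.constant(np.random.randn(1000, 1))

def f_batch_tensorflow(x, A, B):
    e = tf.matmul(A, x) - B
    return tf.reduce_sum(tf.square(e))

loss_before = float(f_batch_tensorflow(X, C, D))

adam_opt = tf.keras.optimizers.Adam()
for _ in range(200):
    # Pass a zero-argument callable, not an already-evaluated tensor
    adam_opt.minimize(lambda: f_batch_tensorflow(X, C, D), var_list=[X])

loss_after = float(f_batch_tensorflow(X, C, D))
```

Each call to minimize re-evaluates the lambda under its own tape, computes gradients with respect to var_list, and applies one update step.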

@dynamicwebpaige @tanzhenyu This is probably a common mistake; we should type-check the input (or catch and rethrow this specific error) to explain what went wrong, maybe with a message like "EagerTensors cannot be passed to Optimizer.minimize. Please pass the callable representing the computation instead."
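The proposed guard could look something like this sketch (minimize_with_check is a hypothetical wrapper, not an actual TensorFlow API):

```python
def minimize_with_check(optimizer, loss, var_list):
    """Hypothetical wrapper that fails early with a clearer message."""
    # Guard against the common mistake of passing an already-evaluated tensor
    if not callable(loss):
        raise TypeError(
            "EagerTensors cannot be passed to Optimizer.minimize. "
            "Please pass the callable representing the computation instead.")
    return optimizer.minimize(loss, var_list)
```

The check runs before TensorFlow's internal loss() call, so the user sees the actionable message instead of the opaque "object is not callable" traceback.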