I'm trying to run my code with TensorFlow.
init = tf.global_variables_initializer()
loaded_graph = tf.Graph()
saver = tf.train.Saver()
with tf.Session(loaded_graph) as sess:
    sess.run(init)
    ...
But I get an error:
File "C:\Users\K451LN\My Documents\LiClipse Workspace\neuralnet\FFNN.py", line 68, in <module>
with tf.Session(loaded_graph) as sess:
AttributeError: 'Session' object has no attribute '_session'
Is there something wrong with tf.Graph()?
Here is my code:
for i in range(num_networks):
    print("Neural network: {0}".format(i))
    X = tf.placeholder(tf.float32)
    Y = tf.placeholder(tf.float32)
    W1 = tf.Variable(tf.random_uniform([n_input, n_hidden], -1.0, 1.0), name='W1')
    W2 = tf.Variable(tf.random_uniform([n_hidden, n_output], -1.0, 1.0), name='W2')
    b1 = tf.Variable(tf.zeros([n_hidden]), name="Bias1")
    b2 = tf.Variable(tf.zeros([n_output]), name="Bias2")
    L2 = tf.sigmoid(tf.matmul(X, W1) + b1)
    hy = tf.sigmoid(tf.matmul(L2, W2) + b2, name="op_to_restore")
    cost = tf.reduce_mean(-Y*tf.log(hy) - (1-Y)*tf.log(1-hy))
    optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
    init = tf.global_variables_initializer()
    loaded_graph = tf.Graph()
    saver = tf.train.Saver()
    with tf.Session(loaded_graph) as sess:
        sess.run(init)
        ...
I added tf.Graph() to work around the error `ValueError: At least two variables have the same name: Bias2`.
Posted on 2018-03-16 10:40:31
Passing loaded_graph to tf.Session() means you can only run operations that were created in that graph. Since all you do is create a graph named loaded_graph without adding anything to it, you get this error when you try to execute sess.run(init), because init does not live in the loaded_graph graph.
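The scoping rule behind that error can be illustrated without TensorFlow. The `Graph` and `Session` classes below are simplified, hypothetical stand-ins (not the real TensorFlow API): each operation belongs to exactly one graph, and a session can only run operations from the graph it was bound to.

```python
# Minimal sketch of TF1-style graph scoping. These classes are
# illustrative stand-ins, NOT the real TensorFlow implementation.

class Graph:
    def __init__(self):
        self.ops = set()  # names of ops registered in this graph

    def add_op(self, name):
        self.ops.add(name)
        return name

class Session:
    def __init__(self, graph):
        self.graph = graph  # the one graph this session can execute

    def run(self, op):
        # A session refuses ops that belong to a different graph,
        # just as a TF1 session fails for ops outside its bound graph.
        if op not in self.graph.ops:
            raise RuntimeError("op %r is not an element of this graph" % op)
        return "ran " + op

default_graph = Graph()
init = default_graph.add_op("init")   # 'init' lives in the default graph

loaded_graph = Graph()                # empty graph; nothing was added to it

try:
    Session(loaded_graph).run(init)   # fails: 'init' is not in loaded_graph
except RuntimeError as e:
    print(e)

# Binding the session to the graph that actually contains the op works:
print(Session(default_graph).run(init))
```

This mirrors the situation in the question: `init` was created in the default graph, while the session was bound to the freshly created, empty `loaded_graph`.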
I suspect the original Bias2 error was caused by the for loop: each iteration creates new variables with the same names in the same graph. If you remove the for loop and don't create/pass loaded_graph, no error occurs.
If you do want to use the for loop, you will likely need to create a new graph on each iteration:
g_1 = tf.Graph()
with g_1.as_default():
    ...
So your code would look like:
for i in range(num_networks):
    g_1 = tf.Graph()
    with g_1.as_default():
        print("Neural network: {0}".format(i))
        X = tf.placeholder(tf.float32)
        Y = tf.placeholder(tf.float32)
        W1 = tf.Variable(tf.random_uniform([n_input, n_hidden], -1.0, 1.0), name='W1')
        W2 = tf.Variable(tf.random_uniform([n_hidden, n_output], -1.0, 1.0), name='W2')
        b1 = tf.Variable(tf.zeros([n_hidden]), name="Bias1")
        b2 = tf.Variable(tf.zeros([n_output]), name="Bias2")
        L2 = tf.sigmoid(tf.matmul(X, W1) + b1)
        hy = tf.sigmoid(tf.matmul(L2, W2) + b2, name="op_to_restore")
        cost = tf.reduce_mean(-Y*tf.log(hy) - (1-Y)*tf.log(1-hy))
        optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
        init = tf.global_variables_initializer()
        saver = tf.train.Saver()
        with tf.Session(graph=g_1) as sess:
            sess.run(init)
            ...
https://stackoverflow.com/questions/49317371