TensorFlow Serving: getting a different outcome for the same input


I am using TensorFlow Serving to serve a pre-trained model. The strange thing is that when I feed the same input data to the model, I get a different outcome each time.

I thought the problem might be with variable initialization. Is there a way to debug the model, or how can I find the cause? Thanks.

There are two common problems:

  1. There is a known issue where the default main_op re-initializes variables to random values when the model is loaded.
  2. You left dropout layers enabled in the prediction graph.

To address (1), use this instead:

from tensorflow.python.ops import control_flow_ops
from tensorflow.python.ops import lookup_ops
from tensorflow.python.ops import variables

def main_op():
    # Initialize only local variables and lookup tables; do not re-run the
    # global variable initializer, which would overwrite the restored weights.
    init_local = variables.local_variables_initializer()
    init_tables = lookup_ops.tables_initializer()
    return control_flow_ops.group(init_local, init_tables)
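For context, a minimal sketch (not part of the original answer) of passing this main_op when exporting with the TF 1.x SavedModelBuilder; the export path and signature_def_map are placeholders you would replace with your own:

import tensorflow as tf

builder = tf.saved_model.builder.SavedModelBuilder("/tmp/model/1")  # hypothetical export dir
with tf.Session() as sess:
    # ... build or restore the trained graph into this session here ...
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map=signature_def_map,  # your serving signatures
        main_op=main_op())                    # the custom main_op defined above
    builder.save()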

To address (2), make sure you are not exporting the training graph directly. You need to build a new graph for prediction/serving. If you are using the tf.estimator framework, only add dropout layers when the mode is tf.estimator.ModeKeys.TRAIN, as sketched below.
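A rough sketch of what that looks like, assuming a simple dense network inside a tf.estimator model_fn (layer sizes, feature key, and params are hypothetical):

import tensorflow as tf

def model_fn(features, labels, mode, params):
    # Hypothetical hidden layer; replace with your own architecture.
    net = tf.layers.dense(features["x"], units=128, activation=tf.nn.relu)

    # Apply dropout only while training, so EVAL and PREDICT (serving)
    # produce deterministic outputs.
    if mode == tf.estimator.ModeKeys.TRAIN:
        net = tf.layers.dropout(net, rate=0.5, training=True)

    logits = tf.layers.dense(net, units=params["n_classes"])
    predictions = {"logits": logits, "classes": tf.argmax(logits, axis=1)}

    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode, predictions=predictions)

    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    if mode == tf.estimator.ModeKeys.TRAIN:
        train_op = tf.train.AdamOptimizer().minimize(
            loss, global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

    return tf.estimator.EstimatorSpec(mode, loss=loss)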

