TensorFlow Serving: getting different outcomes for the same input
I'm using TensorFlow Serving to serve a pre-trained model. The strange thing is that when I feed the same input data to the model, I get a different outcome each time.
I thought the problem might be with variable initialization. Is there any way to debug the model, or to find the cause? Thanks.
Two common causes:

- There is a known issue where the default main_op re-initializes variables to random values at load time.
- You left dropout layers active in the prediction graph.
To address (1), use this instead:

```python
from tensorflow.python.ops import control_flow_ops, lookup_ops, variables

def main_op():
    # Initialize only local variables and lookup tables; do NOT run the
    # global variable initializer, which would overwrite the restored
    # trained weights with fresh random values.
    init_local = variables.local_variables_initializer()
    init_tables = lookup_ops.tables_initializer()
    return control_flow_ops.group(init_local, init_tables)
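Here is a minimal sketch of where that custom main_op would be wired in when exporting with the TF1 SavedModelBuilder. Passing it to `add_meta_graph_and_variables()` overrides the default, so serving restores the trained weights instead of re-initializing them. The export directory and the dummy variable `w` are hypothetical placeholders for your own model:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

# Hypothetical export location; in practice use your versioned model path.
export_dir = os.path.join(tempfile.mkdtemp(), "1")

with tf.Graph().as_default(), tf.Session() as sess:
    # Stand-in for your trained model's variables.
    w = tf.get_variable("w", shape=[2], initializer=tf.zeros_initializer())
    sess.run(tf.global_variables_initializer())

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        # Only local variables and tables are initialized at load time;
        # the trained weights are restored from the checkpoint untouched.
        main_op=tf.group(tf.local_variables_initializer(),
                         tf.tables_initializer()))
    builder.save()
```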
To address (2), make sure you aren't directly exporting the training graph; you need to build a separate graph for prediction/serving. If you're using the tf.estimator framework, add dropout layers conditionally, only when the mode is tf.estimator.ModeKeys.TRAIN.
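As a sketch of that conditional-dropout pattern, here is an illustrative `model_fn` (the layer sizes and feature key `"x"` are made up for the example). The `training=` flag ties dropout to the mode, so the PREDICT graph is deterministic for a fixed input:

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

def model_fn(features, labels, mode):
    net = tf.layers.dense(features["x"], 16, activation=tf.nn.relu)
    # Dropout is active only in TRAIN mode; in EVAL/PREDICT it is a no-op,
    # so serving the exported graph gives the same output every time.
    net = tf.layers.dropout(
        net, rate=0.5,
        training=(mode == tf.estimator.ModeKeys.TRAIN))
    logits = tf.layers.dense(net, 2)

    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(
            mode, predictions={"logits": logits})

    loss = tf.losses.sparse_softmax_cross_entropy(labels, logits)
    train_op = tf.train.AdamOptimizer().minimize(
        loss, global_step=tf.train.get_or_create_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)
```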