python - Function approximation with TensorFlow: sigmoid vs relu6


I'm trying to approximate the sin() function (actually it could be any function) with TensorFlow, using a deep neural network with 2 hidden layers of 10 and 5 neurons each. I tried many optimizers, and Adam seems to be the best one (I also found a paper recommending it).
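For reference, the network shape described above (one scalar input, hidden layers of 10 and 5 units, one scalar output) can be sketched in plain NumPy; this is my own illustration of the architecture, not the asker's actual TensorFlow code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights for a 1 -> 10 -> 5 -> 1 network (layer sizes from the question).
W1, b1 = rng.standard_normal((1, 10)), np.zeros(10)
W2, b2 = rng.standard_normal((10, 5)), np.zeros(5)
W3, b3 = rng.standard_normal((5, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Forward pass for a batch of scalar inputs, shape (n, 1)."""
    h1 = sigmoid(x @ W1 + b1)
    h2 = sigmoid(h1 @ W2 + b2)
    return h2 @ W3 + b3  # linear output layer

x = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
print(forward(x).shape)  # (8, 1): one prediction per input
```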

My problem is that if I use the relu6 activation function, which is the one recommended, the approximation looks like https://imgur.com/a/pcsre: it has a lot of edges. On the other hand, if I use sigmoid, the approximation is smoother, with fewer neurons, and looks like https://imgur.com/a/gevss. I don't know why relu6 fails like that.
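The "edges" are expected: relu6 is piecewise linear (zero below 0, identity up to 6, then clamped), so a network built from it produces a piecewise-linear output with visible kinks, whereas sigmoid is smooth everywhere. A minimal sketch of the two activations (my own illustration, not code from the question):

```python
import numpy as np

def relu6(z):
    # min(max(z, 0), 6): flat, then linear, then flat again,
    # with non-differentiable kinks at z = 0 and z = 6.
    return np.minimum(np.maximum(z, 0.0), 6.0)

def sigmoid(z):
    # Smooth (infinitely differentiable) everywhere.
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.0, 3.0, 6.0, 8.0])
print(relu6(z))      # [0. 0. 3. 6. 6.]
print(sigmoid(0.0))  # 0.5
```

Any sum of piecewise-linear functions is still piecewise linear, so with only 15 hidden units the relu6 curve has few segments and the corners stand out.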

I mapped the function to the interval [0, 1] before feeding it to the train step.
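The question doesn't show the exact mapping; a common choice for sin, whose range is [-1, 1], is an affine rescale to [0, 1], sketched here as an assumption:

```python
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 100)
y = np.sin(x)

# Affine rescale from [-1, 1] to [0, 1], so targets fit
# inside the output range of a sigmoid unit.
y01 = (y + 1.0) / 2.0

print(y01.min() >= 0.0 and y01.max() <= 1.0)  # True
```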

Any light on this is appreciated. Thanks in advance!

