{"cells":[{"cell_type":"markdown","metadata":{},"source":["https://arxiv.org/abs/1701.00160\n\nこのペーパーには、7章に練習問題が付いている。\nその1問目から分からない、という話。"]},{"cell_type":"markdown","metadata":{},"source":["# 7.1 The optimal discriminator strategy\n\n$J^{(D)}(\\theta^{(D)}, \\theta^{(G)}) = 1/2 E_{x~p_{data}} log D(x) - 1/2E_x log(1-D(G(z)))$\n\nを$\\theta^{(D)}$に関して最小化したい、というのが一般的な問題だが、ここでさらに、D(x)がxと独立に自由に決められる場合を考える。\nこの時、最適なDは何になるか?"]},{"cell_type":"markdown","metadata":{},"source":["# 8.1ペーパー内にある略式回答(が分からない)\n\n8.1に回答があって、JをDで汎関数微分してイコールゼロで解くと\n\n$ D^*(x) = \\frac{p_{data}(x)}{p_{data}(x) + p_{model}(x)}$\n\nとなる、と書いてある。でも期待値の汎関数微分とか良く分からない。"]},{"cell_type":"markdown","metadata":{},"source":["# 試行錯誤いろいろ"]}],"metadata":{"kernelspec":{"display_name":"Python 2","language":"python","name":"python2"},"lanbuage_info":{"codemirror_mode":{"name":"ipython","version":2},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython2","version":"2.7.11"}},"nbformat":4,"nbformat_minor":0}