Conversation
This commit introduces name_scopes for the convolutional layers, for their weights and biases, and for the fully-connected layers, so that these items show up with meaningful names in TensorBoard. This commit also introduces scalar summaries for the accuracy and cost, which are calculated every 100th optimization iteration. Additionally, this commit adds histogram summaries for the weights and biases, so that changes in these tensors over time can be seen more easily. The default log directory has been set to "logs", which will create a new directory "logs" in the same directory as 02_Convolutional_Neural_Network.ipynb. To access TensorBoard, from a terminal, type: `tensorboard --logdir=/path/to/02_CNN/folder/logs`
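For context, wrapping a layer's variables in a name_scope might look like the following sketch. This uses the TF1-style API (via `tf.compat.v1`); the scope name and the filter shape are illustrative, not taken from the notebook itself:

```python
import tensorflow.compat.v1 as tf  # TF1-style API, as used by the tutorials

tf.disable_v2_behavior()

# Wrap a conv layer's weights and biases in a name_scope so they appear
# as one collapsible node, with meaningful names, in TensorBoard's graph
# view. The filter shape [5, 5, 1, 16] is illustrative only.
with tf.name_scope("conv_layer_1"):
    weights = tf.Variable(tf.truncated_normal([5, 5, 1, 16], stddev=0.05),
                          name="weights")
    biases = tf.Variable(tf.constant(0.05, shape=[16]), name="biases")

# The scope becomes a prefix of each tensor's name.
print(weights.name)
print(biases.name)
```

Variables created with `tf.Variable` pick up the enclosing name_scope as a prefix, which is what makes the graph view readable.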
Thanks very much for this pull request! I am flattered that you wanted to contribute to these tutorials! TensorBoard is actually on my list of interesting topics to cover, but I rarely ever use it myself, which is why I haven't gotten around to it. I believe one of the reasons these tutorials are becoming popular is that I only cover a few closely related topics in each tutorial. The problem with the 'official' tutorials is that they cover 10 different things at the same time, so beginners are completely confused. So I think it is a bad idea to add TensorBoard to Tutorial #2, especially because it is already quite complicated. What I would like to suggest is that you start your own series of tutorials with topics that I haven't covered. I'll be happy to describe my procedure for recording and editing the video tutorials. It took me a long time to figure out how to do it (I probably spent a week installing and testing different video editors!). I hope this makes sense, and thanks again for the PR!
Maybe this makes more sense as a separate branch (since this would cause the notebook to diverge further from the notebook shown in the tutorial).
This merge introduces name_scopes for the convolutional layers, for their weights and biases, and for the fully-connected layers, so that these items show up with meaningful names in TensorBoard.
This merge also introduces scalar summaries for the accuracy and cost, which are calculated every 100th optimization iteration, for display within TensorBoard.
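A minimal sketch of such scalar summaries, again in the TF1-style API. The placeholders and the dummy values fed in are illustrative; in the real notebook the accuracy and cost come from the training graph:

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

# Illustrative stand-ins for the notebook's accuracy and cost tensors.
accuracy = tf.placeholder(tf.float32, name="accuracy")
cost = tf.placeholder(tf.float32, name="cost")

# Scalar summaries make these values show up as curves in TensorBoard.
tf.summary.scalar("accuracy", accuracy)
tf.summary.scalar("cost", cost)
merged = tf.summary.merge_all()

writer = tf.summary.FileWriter("logs")
with tf.Session() as sess:
    for i in range(300):
        # ... one optimization step would run here ...
        if i % 100 == 0:  # record summaries every 100th iteration
            summary = sess.run(merged,
                               feed_dict={accuracy: 0.9, cost: 0.1})
            writer.add_summary(summary, global_step=i)
writer.close()
```

The `FileWriter` writes an events file into "logs", which is what the `tensorboard` command later reads.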
Additionally, this merge adds histogram summaries for the weights and biases, so that changes in these tensors over time can be seen more easily in TensorBoard.
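Histogram summaries follow the same pattern as the scalar ones; a sketch with illustrative variable shapes (not the notebook's actual layer sizes):

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

with tf.name_scope("fc_layer"):
    weights = tf.Variable(tf.truncated_normal([128, 10], stddev=0.05),
                          name="weights")
    biases = tf.Variable(tf.constant(0.05, shape=[10]), name="biases")

# Histogram summaries record the distribution of each tensor at every
# logged step, so drift in the weights/biases during training is visible
# in TensorBoard's Histograms/Distributions tabs.
tf.summary.histogram("fc_layer/weights", weights)
tf.summary.histogram("fc_layer/biases", biases)
merged = tf.summary.merge_all()

writer = tf.summary.FileWriter("logs")
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    summary = sess.run(merged)  # serialized summary protobuf
    writer.add_summary(summary, global_step=0)
writer.close()
```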
The default log directory has been set to "logs", which will create a new directory "logs" in the same directory as 02_Convolutional_Neural_Network.ipynb.
To access TensorBoard, from a terminal, type:
tensorboard --logdir=/path/to/02_CNN/folder/logs