
Commit a95ffe7

Merge pull request #257 from Matanley/patch-1: Fix typo

2 parents 01c2b82 + 6df85a5

File tree

1 file changed: +4 -4 lines


batch-norm/Batch_Normalization_Lesson.ipynb

Lines changed: 4 additions & 4 deletions
@@ -125,7 +125,7 @@
 " followed by a hidden layer with 100 nodes, followed by an output layer with 10 nodes.\n",
 " :param activation_fn: Callable\n",
 " The function used for the output of each hidden layer. The network will use the same\n",
-" activation function on every hidden layer and no activate function on the output layer.\n",
+" activation function on every hidden layer and no activation function on the output layer.\n",
 " e.g. Pass tf.nn.relu to use ReLU activations on your hidden layers.\n",
 " :param use_batch_norm: bool\n",
 " Pass True to create a network that uses batch normalization; False otherwise\n",
@@ -278,7 +278,7 @@
 " if False, perform inference with batch normalization using estimated population mean and variance.\n",
 " Note: in real life, *always* perform inference using the population mean and variance.\n",
 " This parameter exists just to support demonstrating what happens if you don't.\n",
-" :param include_individual_predictions: bool (default True)\n",
+" :param include_individual_predictions: bool (default False)\n",
 " This function always performs an accuracy test against the entire test set. But if this parameter\n",
 " is True, it performs an extra test, doing 200 predictions one at a time, and displays the results\n",
 " and accuracy.\n",
@@ -1806,8 +1806,8 @@
 " train_mean = tf.assign(pop_mean, pop_mean * decay + batch_mean * (1 - decay))\n",
 " train_variance = tf.assign(pop_variance, pop_variance * decay + batch_variance * (1 - decay))\n",
 "\n",
-" # The 'tf.control_dependencies' context tells TensorFlow it must calculate 'train_mean' \n",
-" # and 'train_variance' before it calculates the 'tf.nn.batch_normalization' layer.\n",
+" # The `tf.control_dependencies` context tells TensorFlow it must calculate `train_mean` \n",
+" # and `train_variance` before it calculates the `tf.nn.batch_normalization` layer.\n",
 " # This is necessary because those two operations are not actually in the graph\n",
 " # connecting the linear_output and batch_normalization layers, \n",
 " # so TensorFlow would otherwise just skip them.\n",
