Ensure all tutorials use the recommended Generator API for random number generation (numpy#71)
* Use new random API to sample digits for display.
* Change references in text to new random API (w/ links).
* Use new random API in training/eval loop.
* Use new random API in pong tutorial.
* Grab MNIST data from GitHub mirror.
````diff
+for sample, ax in zip(rng.choice(x_train, size=num_examples, replace=False), axes):
+    ax.imshow(sample.reshape(28, 28), cmap='gray')
 ```
````

> **Note:** You can also visualize a sample image as an array by printing `x_train[59999]`. Here, `59999` is your 60,000th training image sample (`0` would be your first). Your output will be quite long and should contain an array of 8-bit integers:
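For context, here is a minimal, self-contained sketch of how the new sampling line behaves; the seed, `num_examples`, and the stand-in `x_train` are illustrative assumptions, not values from the tutorial:

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(seed=0)   # hypothetical seed for this sketch
x_train = rng.random((60000, 784))    # stand-in for the flattened MNIST images
num_examples = 5                      # hypothetical number of digits to display

fig, axes = plt.subplots(1, num_examples, figsize=(10, 2))
# Generator.choice draws `num_examples` distinct rows (images) without replacement.
for sample, ax in zip(rng.choice(x_train, size=num_examples, replace=False), axes):
    ax.imshow(sample.reshape(28, 28), cmap='gray')
plt.show()
```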
````diff
@@ -308,7 +312,7 @@ Afterwards, you will construct the building blocks of a simple deep learning mod
 > **Note:** For simplicity, the bias term is omitted in this example (there is no `np.dot(layer, weights) + bias`).
-- _Weights_: These are important adjustable parameters that the neural network fine-tunes by forward and backward propagating the data. They are optimized through a process called [gradient descent](https://en.wikipedia.org/wiki/Stochastic_gradient_descent). Before the model training starts, the weights are randomly initialized with NumPy's `np.random.random()` function.
+- _Weights_: These are important adjustable parameters that the neural network fine-tunes by forward and backward propagating the data. They are optimized through a process called [gradient descent](https://en.wikipedia.org/wiki/Stochastic_gradient_descent). Before the model training starts, the weights are randomly initialized with NumPy's [`Generator.random()`](https://numpy.org/doc/stable/reference/random/generated/numpy.random.Generator.random.html).
 The optimal weights should produce the highest prediction accuracy and the lowest error on the training and test sets.
````
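The quoted paragraph leans on gradient descent without showing it; schematically, each training step nudges randomly initialized weights against a gradient. A toy sketch, with every name and value hypothetical:

```python
import numpy as np

rng = np.random.default_rng(seed=1)   # hypothetical seed
weights = rng.random((784, 100))      # random initialization, as described
learning_rate = 0.005                 # hypothetical step size

def loss_gradient(w):
    # Stand-in for the gradient that backpropagation would compute.
    return 2 * w

# One gradient-descent update: step against the gradient to reduce the loss.
weights -= learning_rate * loss_gradient(weights)
```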
````diff
@@ -318,7 +322,7 @@ Afterwards, you will construct the building blocks of a simple deep learning mod
 - _Regularization_: This [technique](https://en.wikipedia.org/wiki/Regularization_(mathematics)) helps prevent the neural network model from [overfitting](https://en.wikipedia.org/wiki/Overfitting).
-In this example, you will use a method called dropout — [dilution](https://en.wikipedia.org/wiki/Dilution_(neural_networks)) — that randomly sets a number of features in a layer to 0s. You will define it with NumPy's `np.random.randint()` function and apply it to the hidden layer of the network.
+In this example, you will use a method called dropout — [dilution](https://en.wikipedia.org/wiki/Dilution_(neural_networks)) — that randomly sets a number of features in a layer to 0s. You will define it with NumPy's [`Generator.integers()`](https://numpy.org/doc/stable/reference/random/generated/numpy.random.Generator.integers.html) method and apply it to the hidden layer of the network.
 - _Loss function_: The computation determines the quality of predictions by comparing the image labels (the truth) with the predicted values in the final layer's output.
````
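As a rough illustration of the dropout technique the new line describes, `Generator.integers()` can build a 0/1 mask over a layer's activations; the shapes and the factor of 2 here are common choices, not necessarily the tutorial's exact code:

```python
import numpy as np

rng = np.random.default_rng(seed=2)   # hypothetical seed
layer = rng.random((4, 5))            # stand-in hidden-layer activations

# integers(low=0, high=2) yields 0s and 1s (high is exclusive), zeroing ~half the features.
dropout_mask = rng.integers(low=0, high=2, size=layer.shape)
# Rescaling by 2 keeps the layer's expected activation unchanged during training.
layer = layer * dropout_mask * 2
```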
````diff
@@ -368,10 +372,12 @@ Here is a summary of the neural network model architecture and the training proc
 Having covered the main deep learning concepts and the neural network architecture, let's write the code.
-**1.** For reproducibility, initialize a random seed with `np.random.seed()`:
+**1.** We'll start by creating a new random number generator, providing a seed
+for reproducibility:
 ```{code-cell} ipython3
-np.random.seed(1)
+seed = 884736743
+rng = np.random.default_rng(seed)
 ```
 **2.** For the hidden layer, define the ReLU activation function for forward propagation and ReLU's derivative that will be used during backpropagation:
````
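Step 2's definitions fall outside this hunk; a standard formulation that matches the description would look like this (a sketch, not necessarily the tutorial's exact code):

```python
import numpy as np

def relu(x):
    # Forward pass: keep non-negative values, clamp negatives to 0.
    return (x >= 0) * x

def relu2deriv(output):
    # Backward pass: gradient is 1 where the unit was active, 0 elsewhere.
    return output >= 0
```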
````diff
@@ -403,11 +409,11 @@ pixels_per_image = 784
 num_labels = 10
 ```
-**4.** Initialize the weight vectors that will be used in the hidden and output layers with `np.random.random()`:
+**4.** Initialize the weight vectors that will be used in the hidden and output layers with random values:
````
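The code for step 4 is also outside the hunk; using the `rng` created in step 1 and the `pixels_per_image`/`num_labels` constants shown above, it would plausibly read as below. `hidden_size` and the [-0.1, 0.1) scaling are assumptions for illustration:

```python
hidden_size = 100  # hypothetical hidden-layer width; not shown in this diff

# Uniform draws in [0, 1) rescaled to small values centered on zero.
weights_1 = 0.2 * rng.random((pixels_per_image, hidden_size)) - 0.1
weights_2 = 0.2 * rng.random((hidden_size, num_labels)) - 0.1
```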
**content/tutorial-deep-reinforcement-learning-with-pong-from-pixels.md** (+14, -4)
````diff
@@ -264,6 +264,16 @@ Next, you will define the policy as a simple feedforward network that uses a gam
 1. Let's instantiate certain parameters for the input, hidden, and output layers, and start setting up the network model.
+
+Start by creating a random number generator instance for the experiment
+(seeded for reproducibility):
+
+```{code-cell}
+
+rng = np.random.default_rng(seed=12288743)
+```
+
+Then:
 +++ {"id": "PbqQ3kPBRfvn"}
 - Set the input (observation) dimensionality - your preprocessed screen frames:
````
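The actual dimensionality settings are outside the added lines; values along the following lines would be consistent with the surrounding text (both numbers are assumptions here):

```python
D = 80 * 80  # input dimensionality: one preprocessed 80x80 frame, flattened
H = 200      # hypothetical number of hidden-layer units
model = {}   # dictionary that will hold the network's weight arrays
```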
````diff
@@ -298,13 +308,13 @@ model = {}
 In a neural network, _weights_ are important adjustable parameters that the network fine-tunes by forward and backward propagating the data.
-2. Using a technique called [Xavier initialization](https://www.deeplearning.ai/ai-notes/initialization/#IV), set up the network model's initial weights with NumPy's [`np.random.randn()`](https://numpy.org/doc/stable/reference/random/generated/numpy.random.randn.html) that return random numbers over a standard Normal distribution, as well as [`np.sqrt()`](https://numpy.org/doc/stable/reference/generated/numpy.sqrt.html?highlight=numpy.sqrt#numpy.sqrt):
+2. Using a technique called [Xavier initialization](https://www.deeplearning.ai/ai-notes/initialization/#IV), set up the network model's initial weights with NumPy's [`Generator.standard_normal()`](https://numpy.org/doc/stable/reference/random/generated/numpy.random.Generator.standard_normal.html) that returns random numbers over a standard Normal distribution, as well as [`np.sqrt()`](https://numpy.org/doc/stable/reference/generated/numpy.sqrt.html?highlight=numpy.sqrt#numpy.sqrt):
````
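Putting the renamed call together, Xavier-style initialization with the Generator API would look roughly like this; the keys `W1`/`W2` and the `D`/`H` dimensions follow the assumptions above rather than the file itself:

```python
import numpy as np

rng = np.random.default_rng(seed=12288743)  # the generator created earlier in this diff

D, H = 80 * 80, 200  # assumed layer dimensions, as above
model = {}
# Divide standard-normal draws by sqrt(fan-in) so early activations keep unit-scale variance.
model['W1'] = rng.standard_normal(size=(H, D)) / np.sqrt(D)
model['W2'] = rng.standard_normal(size=H) / np.sqrt(H)
```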
0 commit comments