Commit b7c526b

Minor improvements
1 parent 69891ef commit b7c526b

2 files changed: 12 additions, 8 deletions


pytorch_tutorial/logistic_regression/README.md

Lines changed: 3 additions & 3 deletions
@@ -109,6 +109,9 @@ This model has two inputs (the x- and y-coordinates of a sample) and as many out
 ```python
 # Create a logistic regression model for the 2D dataset
 model = nn.Linear(in_features=2, out_features=output_dim).to(device)
+
+# Print model architecture
+print(model)
 ```

 ### Parameter count
@@ -118,9 +121,6 @@ The number of parameters for this model is equal to the number of entries multip
 > The `get_parameter_count()` utility function was defined in a [previous example](../linear_regression/README.md#parameter-count).

 ```python
-# Print model architecture
-print(model)
-
 # Compute and print parameter count
 n_params = get_parameter_count(model)
 print(f"Model has {n_params} trainable parameters")
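The diff above moves `print(model)` next to the model definition and leaves the parameter count in its own snippet. A minimal self-contained sketch of how the rearranged snippets fit together, assuming `output_dim = 3`, CPU-only execution (no `.to(device)`), and a hypothetical re-implementation of the repo's `get_parameter_count()` helper as a sum over trainable parameter tensors:

```python
import torch.nn as nn

# Hypothetical stand-in for the repo's get_parameter_count() utility:
# sum the element counts of all trainable parameter tensors
def get_parameter_count(model):
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Create a logistic regression model for the 2D dataset
# (output_dim = 3 is an assumption for illustration)
output_dim = 3
model = nn.Linear(in_features=2, out_features=output_dim)

# Print model architecture
print(model)  # Linear(in_features=2, out_features=3, bias=True)

# Compute and print parameter count: weights (2 x 3) + biases (3) = 9
n_params = get_parameter_count(model)
print(f"Model has {n_params} trainable parameters")
```

For a `Linear` layer, the count is `in_features * out_features` weights plus `out_features` biases, which matches the README's "number of entries multiplied" rule.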

pytorch_tutorial/multilayer_perceptron/README.md

Lines changed: 9 additions & 5 deletions
@@ -15,7 +15,7 @@ math: true # Use default Marp engine for math rendering

 ## Scope and objective

-This example trains a MultiLayer Perceptron (a feedforward neural network with one hidden layer) to classify 2D data. The complete source code is available [here](test_multilayer_perceptron.py).
+This example trains a MultiLayer Perceptron (a feedforward neural network with one hidden layer) to classify 2D data. It is designed to mimic the experience of the [TensorFlow Playground](https://playground.tensorflow.org/#activation=tanh&batchSize=5&dataset=circle&regDataset=reg-plane&learningRate=0.1&regularizationRate=0&noise=0&networkShape=3&seed=0.94779&showTestData=false&discretize=false&percTrainData=50&x=true&y=true&xTimesY=false&xSquared=false&ySquared=false&cosX=false&sinX=false&cosY=false&sinY=false&collectStats=false&problem=classification&initZero=false&hideText=false). The complete source code is available [here](test_multilayer_perceptron.py).

 ![Training outcome](images/multilayer_perceptron.png)

@@ -105,7 +105,11 @@ assert n_batches == math.ceil(n_samples / batch_size)

 ## Model definition

-A PyTorch model is defined by combining elementary blocks, known as *modules*. Here, we use the [Sequential](https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html) class as a container for these blocks. Its output is a scalar value squashed into the $[0,1]$ range by the [Sigmoid](<https://pytorch.org/docs/stable/generated/torch.nn.Sigmoid.html>) activation function.
+A PyTorch model is defined by combining elementary blocks, known as *modules*.
+
+Here, we use the [Sequential](https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html) class as a container of [Linear](https://pytorch.org/docs/stable/generated/torch.nn.Linear.html) layers. The model output is a scalar value squashed into the $[0,1]$ range by the [Sigmoid](<https://pytorch.org/docs/stable/generated/torch.nn.Sigmoid.html>) activation function.
+
+### Model implementation

 ```python
 # Create a MultiLayer Perceptron with 2 inputs, a hidden layer and 1 output
@@ -119,6 +123,9 @@ model = nn.Sequential(
 # Activation function for the output layer
 nn.Sigmoid(),
 ).to(device)
+
+# Print model architecture
+print(model)
 ```

 ### Parameter count
@@ -128,9 +135,6 @@ The total number of parameters for this model is obtained by summing the paramet
 > The `get_parameter_count()` utility function was defined in a [previous example](../linear_regression/README.md#parameter-count).

 ```python
-# Print model architecture
-print(model)
-
 # Compute and print parameter count
 n_params = get_parameter_count(model)
 print(f"Model has {n_params} trainable parameters")
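As above, the diff relocates `print(model)` to follow the `nn.Sequential` definition. A minimal sketch of the resulting flow, assuming a hidden layer of 3 neurons with a `Tanh` activation (matching the linked TensorFlow Playground preset; the README's actual sizes and hidden activation are not visible in this diff), CPU-only execution, and the same hypothetical `get_parameter_count()` re-implementation:

```python
import torch.nn as nn

# Hypothetical stand-in for the repo's get_parameter_count() utility
def get_parameter_count(model):
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Create a MultiLayer Perceptron with 2 inputs, a hidden layer and 1 output
# (hidden_dim = 3 and Tanh are assumptions for illustration)
hidden_dim = 3
model = nn.Sequential(
    # Hidden layer: 2 inputs -> hidden_dim neurons
    nn.Linear(2, hidden_dim),
    nn.Tanh(),
    # Output layer: hidden_dim neurons -> 1 scalar output
    nn.Linear(hidden_dim, 1),
    # Activation function for the output layer
    nn.Sigmoid(),
)

# Print model architecture
print(model)

# Summing per-layer counts: (2*3 + 3) + (3*1 + 1) = 13
n_params = get_parameter_count(model)
print(f"Model has {n_params} trainable parameters")
```

The total is the sum of each `Linear` layer's `in_features * out_features + out_features`, which is the summation rule the README's "Parameter count" section describes; the activation modules contribute no parameters.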
