Commit 496333a
Merge pull request #13 from ablaom/pluto-again
Add pluto tutorial for 04
2 parents 26eb334 + b224c44 commit 496333a

7 files changed: +1463 −1425 lines

notebooks/04_tuning/gamma_sampler.png

Binary file changed: 512 Bytes (−2.68 KB)

notebooks/04_tuning/notebook.ipynb

Lines changed: 1279 additions & 1318 deletions
Large diffs are not rendered by default.

notebooks/04_tuning/notebook.jl

Lines changed: 5 additions & 5 deletions

@@ -78,6 +78,7 @@ iterator(r, 5)
 #-

 using Plots
+gr(size=(490,300))
 _, _, lambdas, losses = learning_curve(mach,
                                        range=r,
                                        resampling=CV(nfolds=6),
@@ -142,9 +143,10 @@ r = range(model,
           unit=5,
           scale=:log10)

-# The `scale` in a range makes no in a `RandomSearch` (unless it is a
-# function) but this will effect later plots but it does effect the
-# later plots.
+# The `scale` in a range is ignored in a `RandomSearch`, unless it is a
+# function. (It *is* relevant in a `Grid` search, not demonstrated
+# here.) Note however, the choice of scale *does* effect how later plots
+# will look.

 # Let's see what sampling using a Gamma distribution is going to mean
 # for this range:
@@ -189,8 +191,6 @@ predict(tuned_mach, rows=1:3)
 rep = report(tuned_mach);
 rep.best_model

-# By default, sampling of a bounded range is uniform. Lets
-
 # In the special case of two-parameters, you can also plot the results:

 plt = plot(tuned_mach)

0 commit comments