When running sjSDM_cv in parallel (using parallel::parLapply), it appears that the sub_tune_samples object is not passed to each worker. Instead, each worker seems to evaluate tune_samples in tune_func() with t always equal to 1, rather than iterating over 1:nrow(sub_tune_samples). Lines 230-235:
As a result, it seems to me that the models are trained with identical tuning parameters within each cross-validation fold, leading to identical CV results within each fold regardless of the specified tuning grid.
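The underlying pitfall can be reproduced in base R, independently of sjSDM: a function shipped to parLapply workers whose enclosing environment is the global environment does not carry its free variables along, so any object not passed as an argument (or exported via clusterExport) is missing on the workers. A minimal sketch:

library(parallel)
cl <- makeCluster(2)
x <- 10
# Workers do not share the master's global environment, so the
# free variable `x` is not found when the function runs remotely:
res <- tryCatch(
  parLapply(cl, 1:2, function(i) i + x),
  error = function(e) "object 'x' not found on the workers"
)
# Passing x explicitly as an argument works, because parLapply
# serializes the arguments together with the call:
res2 <- parLapply(cl, 1:2, function(i, x) i + x, x = x)
stopCluster(cl)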
Reproducible example:
set.seed(42)
community <- sjSDM::simulate_SDM(sites = 100, species = 10, env = 3, se = TRUE)
Env <- community$env_weights
Occ <- community$response
SP <- matrix(rnorm(200, 0, 0.3), 100, 2) # spatial coordinates (no effect on species occurrences)
tune_cv <- sjSDM::sjSDM_cv(Y = Occ,
                           env = Env,
                           CV = 5,
                           tune = "random",
                           tune_steps = 10L,
                           step_size = 5L,
                           device = "cpu",
                           n_cores = 10L,
                           sampling = 100L,
                           learning_rate = 0.01,
                           iter = 100L)
tune_cv
As a quick fix, I suggest adding tune_samples as an additional argument to tune_func() in sjSDM_cv.
Line 118:
and when applying the function:
Lines 215-217:
Lines 230-235:
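A minimal sketch of the suggested quick fix, assuming a tune_func(t, ...) signature; the names and call shape are illustrative and may not match the sjSDM source exactly:

# Hypothetical sketch -- actual sjSDM internals may differ.
# Accept the fold's tuning grid as an explicit argument so that
# parLapply serializes it together with the call:
tune_func <- function(t, sub_tune_samples) {
  pars <- sub_tune_samples[t, ]
  # ... fit the model for this row of the tuning grid ...
}

# and when applying the function, pass the grid through parLapply:
parallel::parLapply(cl, seq_len(nrow(sub_tune_samples)), tune_func,
                    sub_tune_samples = sub_tune_samples)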