New features #45
I would probably start with removing the
Bonus one is to add
Yeah, it's pretty essential that the data the model sees at inference time is exactly the same as the data it sees at training time 🙂 so I agree that sounds like the priority! And I agree with @jacobbieker that I don't think XGBoost models require the data to be normalised (because it chops real-valued inputs up into bins). Does the model also get historical NWP data? If not, I think that might help a bit: i.e. if the model gets lagged GSP data for n timesteps in the past, then it might be useful to give the model NWP data for those same timesteps so the model can see the difference between the expected forecast (given the NWP) and what actually happened in the recent past. But maybe the model is already doing that?
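(For illustration only, a minimal sketch of that lagged-feature idea, assuming a half-hourly pandas DataFrame with hypothetical `gsp_pv` (outturn) and `nwp_dswrf` (forecast irradiance) columns. The column names and lag choices are placeholders, not the repo's actual feature pipeline.)

```python
import pandas as pd


def add_lagged_features(df: pd.DataFrame, lags: list[int]) -> pd.DataFrame:
    """For each lag, add the past GSP outturn and the NWP value valid at
    that same past timestep, so the model can compare what was expected
    (NWP) with what actually happened (GSP)."""
    out = df.copy()
    for lag in lags:
        out[f"gsp_pv_lag_{lag}"] = out["gsp_pv"].shift(lag)
        out[f"nwp_dswrf_lag_{lag}"] = out["nwp_dswrf"].shift(lag)
    return out.dropna()


# Example: lags of 30 minutes, 1 hour, 2 hours and 1 day on half-hourly data.
# features = add_lagged_features(df, lags=[1, 2, 4, 48])
```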
Thanks @JackKelly and @jacobbieker, I re-ordered the list above. Do you think that order is about right?
LGTM!
Looks great!
Thanks, @dantravers, are you happy with this?
Looks reasonable to me! I'd be curious to see if this does well, so it could be higher?
- Remove `sde` from the training - remove sde and save to s3 #48
- Remove `hcct` from the training - retrain without hcct #51
- Add `mcc` and `hcc` to nwp variables - Issue/mcc lcc #52
- Change `n_estimators` - change estimators to 1250 #53
- Add `smoother` to results (see the sketch below)

Would be interested to hear what people think I should do first?
@JackKelly @jacobbieker @dantravers
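(A rough, non-authoritative sketch of the last two items, assuming an `XGBRegressor` and a simple rolling mean standing in for the smoother. The synthetic features and the 3-step window are placeholders, not the project's actual code.)

```python
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

# Placeholder data standing in for the real GSP/NWP feature set.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(500, 4)), columns=["f0", "f1", "f2", "f3"])
y = X.sum(axis=1) + rng.normal(scale=0.1, size=500)

# n_estimators bumped to 1250, as proposed in the list above.
model = XGBRegressor(n_estimators=1250)
model.fit(X, y)

# A simple centred rolling-mean smoother applied to the forecast results.
predictions = pd.Series(model.predict(X), index=X.index)
smoothed = predictions.rolling(window=3, center=True, min_periods=1).mean()
```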