
Set random seed for unit tests #196

Merged
merged 4 commits into ML4GW:main on Feb 6, 2025
Conversation

wbenoit26
Contributor

Sets a seed for unit testing so that the tests are really just checking whether a PR breaks any existing functionality. A separate project that we should look into is the error distribution of all of the probabilistic tests.

This change means that if a future test doesn't work with this seed due to randomness, a new random seed will have to be found for which things pass.

@wbenoit26 wbenoit26 requested a review from EthanMarx February 5, 2025 21:21
Contributor

@deepchatterjeeligo deepchatterjeeligo left a comment


Great idea!

@EthanMarx
Collaborator

This is great to have. Can you confirm that this seeds the waveform parameters that are themselves sampled from within fixtures? See conftest.py. If not, might need to rework those such that they're sampled after the seeding is done.

@wbenoit26
Contributor Author

Good catch, yeah, this only works for randomness happening within the tests, not within the fixtures. Let me make that work.


github-actions bot commented Feb 5, 2025

Coverage report

Files with coverage changes (per-file statement and coverage figures not preserved in this extract):
  ml4gw/__init__.py
  ml4gw/distributions.py
  ml4gw/dataloading/hdf5_dataset.py
  ml4gw/transforms/__init__.py
  ml4gw/waveforms/generator.py

This report was generated by python-coverage-comment-action

@EthanMarx
Collaborator

Was thinking about this separately, but it would make sense to have those distribution fixtures just return the distribution object that can then be sampled from within the test.

This would solve the above problem, and also allow us to adjust the number of samples we want to generate based on the specific test.

@wbenoit26
Contributor Author

I was thinking the opposite - that it would be convenient to have conftest.py contain all these distribution fixtures that various tests can use without needing to include the sampling. For different tests wanting different numbers of samples, we could add a num_samples fixture to individual tests and have each distribution take it as an argument.

@wbenoit26
Contributor Author

So in conftest.py, we have:

@pytest.fixture()
def chirp_mass(seed_everything, num_samples):
    dist = Uniform(5, 100)
    return dist.sample(torch.Size((num_samples,)))

And in test_cbc_waveforms.py, we have:

@pytest.fixture()
def num_samples():
    return 100
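Unrolled outside of pytest, the fixture chain above amounts to something like this self-contained sketch (the seed value is a stand-in for whatever seed_everything sets; Uniform is torch.distributions.Uniform):

```python
import torch
from torch.distributions import Uniform

# Stand-ins for the seed_everything and num_samples fixtures
torch.manual_seed(101588)
num_samples = 100

# Equivalent of the chirp_mass fixture body from conftest.py
dist = Uniform(5, 100)
chirp_mass = dist.sample(torch.Size((num_samples,)))
```

Because chirp_mass depends on seed_everything, pytest is forced to run the seeding before the sampling, which addresses the fixture-ordering concern raised earlier in the thread.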

@EthanMarx
Collaborator

Yeah I think that'll work too, just ensure the num_samples is getting propagated properly.

If conftest.py is imported / evaluated before num_samples is defined, that might not work.

@wbenoit26
Contributor Author

@EthanMarx I switched over to using a num_samples fixture everywhere that was relevant and all of the tests run. Let's wait to make sure all of the tests pass, and then I think this will be good to go.

@EthanMarx
Collaborator

Great. Can you quadruple check that num_samples is being appropriately propagated for the different tests?

@wbenoit26
Contributor Author

Yeah, that all works as expected.

Collaborator

@EthanMarx EthanMarx left a comment


LGTM

@wbenoit26 wbenoit26 merged commit 097361d into ML4GW:main Feb 6, 2025
6 checks passed