
[DO NOT MERGE] Demo js divergence compared to kl divergence#11744

Draft
billdirks wants to merge 7 commits into develop from f/_/demo-js-divergence

Conversation

@billdirks
Contributor

  • Description of PR changes above includes a link to an existing GitHub issue
  • PR title is prefixed with one of: [BUGFIX], [FEATURE], [DOCS], [MAINTENANCE], [CONTRIB], [MINORBUMP]
  • Code is linted - run invoke lint (uses ruff format + ruff check)
  • Appropriate tests and docs have been updated

For more information about contributing, visit our community resources.

After you submit your PR, keep the page open and monitor the statuses of the various checks made by our continuous integration process at the bottom of the page. Please fix any issues that come up and reach out on Slack if you need help. Thanks for contributing!

@netlify

netlify bot commented Mar 24, 2026

Deploy Preview for niobium-lead-7998 canceled.

Name | Link
🔨 Latest commit | 55997a3
🔍 Latest deploy log | https://app.netlify.com/projects/niobium-lead-7998/deploys/69c2db9d368e650008e7d7d4

import numpy as np
import pandas as pd
from scipy import stats
from scipy.spatial.distance import jensenshannon
Contributor Author

Previously, scipy.stats was used to compute KL divergence; now we use jensenshannon instead.
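A minimal sketch of why this swap matters (the distributions here are made up for illustration): KL divergence blows up when the comparison distribution has a zero bucket where the observed distribution does not, while JS divergence stays finite and bounded.

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import jensenshannon

pk = np.array([0.36, 0.48, 0.16])
qk = np.array([0.30, 0.50, 0.20])

kl = stats.entropy(pk, qk)       # KL divergence D(pk || qk)
js = jensenshannon(pk, qk) ** 2  # jensenshannon returns the distance (sqrt), so square it

# KL is infinite when qk has a zero bucket where pk doesn't; JS stays finite.
qk_zero = np.array([0.80, 0.20, 0.00])
kl_inf = stats.entropy(pk, qk_zero)          # inf
js_finite = jensenshannon(pk, qk_zero) ** 2  # finite, bounded above by ln(2)
```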

@codecov

codecov bot commented Mar 24, 2026

⚠️ JUnit XML file not found

The CLI was unable to find any JUnit XML files to upload.
For more help, visit our troubleshooting guide.

"The maximum KL divergence for which to return success=True. If the KL divergence is larger"
"than the provided threshold, the test will return success=False."
)
INTERNAL_WEIGHT_HOLDOUT_DESCRIPTION = (
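The threshold semantics in the docstring above can be sketched as follows; `check_divergence` is a hypothetical helper, not a function from this PR:

```python
def check_divergence(divergence: float, threshold: float) -> dict:
    # Hypothetical sketch of the success rule described in the docstring:
    # success is True only when the divergence does not exceed the threshold.
    return {"success": divergence <= threshold, "observed_value": divergence}
```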
Contributor Author


The changes in the top part of this file (up to the functional change) are about removing the internal weight and tail weight holdouts. In KL divergence, we add these fake weights to prevent dividing by 0. We don't need to worry about this in JS divergence.
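To make the holdout point concrete, here is an illustrative sketch (the holdout scheme and values below are invented for the example, not the exact code removed in this PR): KL needs fake mass in empty buckets, while JS compares each distribution to their midpoint mixture, which is nonzero wherever either distribution has mass.

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import jensenshannon

pk = np.array([0.5, 0.5, 0.0])  # observed partition with an empty bucket
qk = np.array([0.4, 0.4, 0.2])

# Without a holdout, D(qk || pk) is infinite because pk has a zero bucket.
kl_no_holdout = stats.entropy(qk, pk)  # inf

# Illustrative holdout: shave a small weight off pk and spread it over zero buckets.
holdout = 1e-5
pk_held = np.where(pk == 0, holdout / (pk == 0).sum(), pk * (1 - holdout))
kl = stats.entropy(qk, pk_held)  # now finite

# JS divergence needs no holdout: it compares pk and qk to (pk + qk) / 2,
# which is positive wherever either distribution is.
js = jensenshannon(pk, qk) ** 2
```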

Contributor Author

@billdirks billdirks Mar 24, 2026


We would want to keep KL divergence and add JS divergence as a new file. I'm showing this as a replacement so we can see the diff between these files.

This was all AI generated as a demo of the work. There is one line (see comment below) where we compute JS instead of KL divergence. The other changes come from removing the weight-holdout parameters, which we don't need with JS divergence, or from rendering-helper changes.

observed_value = None
else:
observed_value = kl_divergence
js_divergence = jensenshannon(pk, qk) ** 2
Contributor Author

@billdirks billdirks Mar 24, 2026


This is the functional change where we compute JS divergence instead of KL divergence. Everything else deals with the "weight holdouts" that were added to prevent dividing by 0, or with the rendering helpers below.
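A note on the `** 2` in the one-line change above: scipy's `jensenshannon` returns the Jensen-Shannon *distance* (the square root of the divergence), so squaring recovers the divergence. This can be checked against the definition as the average KL divergence to the midpoint mixture (distributions below are made up for the check):

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy

pk = np.array([0.2, 0.5, 0.3])
qk = np.array([0.1, 0.6, 0.3])
m = (pk + qk) / 2

# JS divergence is the average of the two KL divergences to the midpoint mixture.
js_manual = 0.5 * entropy(pk, m) + 0.5 * entropy(qk, m)

# scipy's jensenshannon returns the distance (the square root), hence the ** 2.
js_scipy = jensenshannon(pk, qk) ** 2
```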


return return_obj

# ---- Rendering helpers ----
Contributor Author


I didn't look at the changes in the rendering section.

