
A question on Clipping Flow values #7

Open
MohitLamba94 opened this issue Nov 24, 2024 · 0 comments
Hello,

I noticed your comment mentioning that clipping flow values helped a lot, and that the default you set is (-3, 3):

clip_flow_values: Tuple[float, float] = (-3., 3)

My question is about the range of clipping values for the flow.

Let,

gt_flow = X1 - N

where X1 is an image from the desired distribution, say the CelebA dataset normalised to the range [-1, 1], and N is a sample drawn from a standard normal distribution. So N will mostly lie in [-3, 3], but theoretically it is unbounded, so if we collect statistics over several steps most samples will fall in [-4, 4].

So now if we compute the max and min of gt_flow, which is what the network must predict, they are

gt_flow_max = 1 - (-4) = 5
gt_flow_min = -1 - 4 = -5
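A quick numpy sketch to back up this arithmetic (uniform pixels are used here as a stand-in for CelebA, which is an assumption; real image statistics differ, but the tail behaviour of X1 - N is driven by the normal prior):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000

# X1: stand-in for images normalised to [-1, 1]
x1 = rng.uniform(-1.0, 1.0, size=n_samples)

# N: samples from the standard normal prior
n = rng.standard_normal(n_samples)

# The target the network must regress to
gt_flow = x1 - n

# Over a million samples the extremes go well past +/-3,
# but only a small fraction of values would actually be clipped
print(gt_flow.min(), gt_flow.max())
print(np.mean(np.abs(gt_flow) > 3.0))
```

In this sketch the fraction of targets outside (-3, 3) is small (under a few percent), which may be why the clipping is tolerable in practice, but the extremes do reach roughly (-5, 5) as argued above.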

So I just wanted to check: do you prefer keeping clip_flow_values = (-3, 3) because your results on the Oxford Flowers dataset look pretty good, or do you use something wider?

If my understanding is correct, this flow clipping happens only during inference and not during training, so training is not affected.

But even during inference, will clipping to (-3, 3) hurt, given that the predicted flow might need to reach (-5, 5), or at least (-4, 4)?
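For concreteness, here is a minimal sketch of what inference-time clipping would do in a single Euler sampling step. The `euler_step` function and its `clip` argument are hypothetical, just mirroring the clip_flow_values = (-3, 3) setting from above; the repo's actual sampler may be structured differently:

```python
import numpy as np

def euler_step(x, flow_pred, dt, clip=(-3.0, 3.0)):
    # Clip the predicted flow before integrating; in this sketch the
    # clipping is applied only at sampling time, so training targets
    # like X1 - N are untouched.
    v = np.clip(flow_pred, clip[0], clip[1])
    return x + dt * v

# A predicted flow of 5.0 is clipped to 3.0, so the state moves by dt * 3
x_next = euler_step(np.array([0.0]), np.array([5.0]), dt=0.1)
```

So any error the clipping introduces per step is bounded by dt times the amount clipped off, which is the worry raised above when the true flow sits near +/-5.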

Thank you
