[PT2] FBC #3258
Conversation
weight_port_ids = [3]
bias_port_id = 4

if is_experimental_torch_tracing_enabled():
I'm confused that the same PyTorch function has a different signature depending on the tracing method. Maybe these are actually different functions, and you should reuse the approach from #3237.
The patched tracing captures only `torch.nn.functional.batch_norm` (which internally calls `torch.batch_norm`, and that function has a different signature). This works only because most models use batch normalization through the `nn.BatchNorm2d` module, which calls `torch.nn.functional.batch_norm`.
In the experimental tracing, only `torch.batch_norm` is traced.
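To illustrate the signature mismatch the comment describes: `torch.nn.functional.batch_norm` takes `(input, running_mean, running_var, weight, bias, ...)`, while the lower-level `torch.batch_norm` takes `(input, weight, bias, running_mean, running_var, ...)`, so the positional port of the weight and bias constants differs between the two tracers. The helper below is a hypothetical sketch (not code from the PR); the port numbers for the experimental branch are inferred from the signatures, not taken from the diff.

```python
# Positional argument order of the two PyTorch batch-norm entry points
# (sketch for illustration; check your PyTorch version's signatures):
#   torch.nn.functional.batch_norm(input, running_mean, running_var,
#                                  weight, bias, training, momentum, eps)
#   torch.batch_norm(input, weight, bias, running_mean, running_var,
#                    training, momentum, eps, cudnn_enabled)

def batch_norm_weight_bias_ports(experimental_tracing):
    """Return (weight_port_ids, bias_port_id) for the traced batch-norm op.

    Hypothetical helper: the legacy tracer sees torch.nn.functional.batch_norm,
    the experimental tracer sees torch.batch_norm directly.
    """
    if experimental_tracing:
        # torch.batch_norm: weight is positional argument 1, bias is 2
        return [1], 2
    # torch.nn.functional.batch_norm: weight is argument 3, bias is 4
    return [3], 4
```

The `[3]` / `4` values in the legacy branch match the `weight_port_ids = [3]` and `bias_port_id = 4` lines under review above.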
@anzr299, could you handle this case in your PR?
Yes, I can handle this case in my PR
nncf/experimental/torch2/function_hook/nncf_graph/nncf_graph_builder.py
tests/torch2/function_hook/quantization/test_fast_bias_correction.py
Force-pushed b83c10b to 6b1e982
LGTM
### Changes

- Implemented FBC for the experimental tracing
- Save the graph to `GraphModelWrapper`, so it can be used the same way as the graph of `NNCFNetwork`
- Add `ConstantLayerAttributes` to constant nodes
- Check `is_experimental_torch_tracing_enabled` inside `patch_torch_operators`, to support [optimum-intel](https://github.com/huggingface/optimum-intel/blob/f601b8b1fda4477ed9f9e4293cf3c9d5cec4ad1b/optimum/intel/openvino/__init__.py#L49)

### Related tickets

152996

Co-authored-by: Alexander Dokuchaev <[email protected]>
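The last change above moves the feature-flag check inside the patching entry point, so that importing `nncf.torch` (as optimum-intel does) no longer applies the legacy patches when the experimental tracer is active. A minimal sketch of that pattern, assuming the flag is controlled by an environment variable named `NNCF_EXPERIMENTAL_TORCH_TRACING` (the actual flag name and patching logic in NNCF may differ):

```python
import os


def is_experimental_torch_tracing_enabled():
    # Hypothetical sketch: gate the experimental tracer behind an
    # environment variable; the real NNCF flag may be named differently.
    return os.getenv("NNCF_EXPERIMENTAL_TORCH_TRACING", "0").lower() in ("1", "true")


def patch_torch_operators():
    # Early-return when experimental function-hook tracing is active,
    # so libraries that import nncf.torch at module load time
    # (e.g. optimum-intel) do not trigger the legacy operator patching.
    if is_experimental_torch_tracing_enabled():
        return
    # ... legacy operator patching would go here ...
```

Placing the check inside `patch_torch_operators` (rather than at the call sites) means every importer gets the correct behavior without code changes on their side.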