perf: avoid graph break for SiLUT when inferring (#4790)
This pull request simplifies and optimizes the `forward` method of the
`ActivationFn` class in `deepmd/pt/utils/utils.py`. It removes the
data-dependent condition checks and computes the result directly with
`torch.where`, which avoids a graph break for the SiLUT activation
during inference.
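
For illustration only, the sketch below shows the general pattern: a data-dependent Python branch forces a graph break under `torch.compile`, while `torch.where` keeps the selection inside the traced graph. The threshold value and the tail expression here are placeholders, not the actual constants or formula used in `deepmd/pt/utils/utils.py`.

```python
import torch

def silut_like(x: torch.Tensor, threshold: float = 3.0) -> torch.Tensor:
    """Illustrative SiLU-with-tail activation (not the exact DeePMD-kit code)."""
    silu = x * torch.sigmoid(x)
    # Hypothetical smooth tail applied beyond the threshold.
    tail = torch.tanh(x)
    # A Python-level branch such as
    #     if (x > threshold).any(): ...
    # depends on tensor data and breaks the compiled graph.
    # torch.where performs the same selection element-wise inside the graph.
    return torch.where(x < threshold, silu, tail)
```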
I evaluated this change on the inference-efficiency tasks from
LAMBench using the DPA 3.1 3M model.
| System | Before: Avg Time ± Std (s) | After: Avg Time ± Std (s) | Speedup | Success Rate |
|---|---|---|---|---|
| `catalysts_500.traj` | 211.82 ± 19.31 | **196.14 ± 18.11** | +7.1% | 100.0% |
| `inorganic_500.traj` | 204.62 ± 40.22 | **191.20 ± 36.44** | +6.4% | 100.0% |
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit
- **Refactor**
- Improved the internal logic of the SiLU activation function for more
streamlined processing. No changes to user-facing functionality.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->