Commit b5d8d24

bobrenjc93 authored and pytorchmergebot committed
add README.md for compile time benchmarks (pytorch#143145)
Pull Request resolved: pytorch#143145
Approved by: https://github.com/laithsakka
ghstack dependencies: pytorch#141517, pytorch#143143
1 parent b7ad52a commit b5d8d24

File tree

  • benchmarks/dynamo/pr_time_benchmarks/README.md

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
# Instructions on how to make a new compile time benchmark

1. Make a new benchmark file in `/benchmarks/dynamo/pr_time_benchmarks/benchmarks/`, e.g. https://github.com/pytorch/pytorch/blob/0b75b7ff2b8ab8f40e433a52b06a671d6377997f/benchmarks/dynamo/pr_time_benchmarks/benchmarks/add_loop.py (a rough template is sketched after this list).
2. `cd` into the pr_time_benchmarks directory: `cd benchmarks/dynamo/pr_time_benchmarks`
3. Run `PYTHONPATH=./ python benchmarks/[YOUR_BENCHMARK].py a.txt`
4. (Optional) Flip a flag that you know will change the benchmark and run again, this time writing to `b.txt`: `PYTHONPATH=./ python benchmarks/[YOUR_BENCHMARK].py b.txt`
5. Compare `a.txt` and `b.txt`, located in the `benchmarks/dynamo/pr_time_benchmarks` folder, to make sure the results look as you expect.
6. Check in your new benchmark file and submit a new PR.
7. In a few days, if your benchmark is stable, bug Laith Sakka to enable running your benchmark on all PRs. If you're a Meta employee, you can find the dashboard here: internalfb.com/intern/unidash/dashboard/pt2_diff_time_metrics
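Below is a rough sketch of what such a benchmark file could look like. It is not the canonical template: the class and method names used here (`BenchmarkBase`, `name`, `description`, `_prepare_once`, `_prepare`, `_work`, `enable_compile_time_instruction_count`, `collect_all`, `append_results`) are assumptions inferred from the `add_loop.py` example linked in step 1, so check that file for the actual interface before copying.

```python
# my_benchmark.py -- hypothetical file name; a minimal sketch only.
# The BenchmarkBase API used here is assumed from add_loop.py and may differ.
import sys

from benchmark_base import BenchmarkBase  # assumed importable via PYTHONPATH=./ (step 3)

import torch


class Benchmark(BenchmarkBase):
    def name(self):
        # The name that shows up in a.txt / b.txt and on the dashboard.
        return "my_benchmark"

    def description(self):
        return "compile a trivial function and measure compile time"

    def _prepare_once(self):
        # One-time setup, e.g. allocating inputs.
        self.x = torch.ones(8)

    def _prepare(self):
        # Per-iteration setup: reset dynamo so each measurement compiles from scratch.
        torch._dynamo.reset()

    def _work(self):
        # The work being measured: compile and run a small function.
        @torch.compile(backend="eager")
        def fn(x):
            return x + 1

        fn(self.x)


def main():
    result_path = sys.argv[1]  # a.txt or b.txt from the steps above
    Benchmark().enable_compile_time_instruction_count().collect_all().append_results(
        result_path
    )


if __name__ == "__main__":
    main()
```

With a file like this saved under `benchmarks/`, step 3 becomes `PYTHONPATH=./ python benchmarks/my_benchmark.py a.txt`.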
