
Commit 1dfc4b2

Merge branch 'master' of github.com:csc-training/hpc-python
2 parents: b7e4e1b + 25a7feb

2 files changed: +17, -12 lines

cython/heat-equation/profile.md (+13, -8)

````diff
@@ -1,16 +1,21 @@
 ## Example profile for heat equation solver
 
 ```
-         205595 function calls (201921 primitive calls) in 42.096 seconds
+         591444 function calls (582598 primitive calls) in 15.498 seconds
 
    Ordered by: internal time
-   List reduced from 2229 to 10 due to restriction <10>
+   List reduced from 3224 to 10 due to restriction <10>
 
    ncalls  tottime  percall  cumtime  percall filename:lineno(function)
-      200   41.641    0.208   41.679    0.208 evolve.py:3(evolve)
-        2    0.067    0.033    0.067    0.033 {built-in method write_png}
-    40015    0.039    0.000    0.039    0.000 {range}
-        1    0.022    0.022    0.029    0.029 __init__.py:23(<module>)
-        2    0.020    0.010    0.020    0.010 {built-in method resize}
-     2245    0.012    0.000    0.012    0.000 {numpy.core.multiarray.array}
+      200   14.837    0.074   14.837    0.074 heat.py:9(evolve)
+        2    0.070    0.035    0.070    0.035 {built-in method matplotlib._png.write_png}
+      241    0.052    0.000    0.052    0.000 {built-in method marshal.loads}
+     2467    0.023    0.000    0.040    0.000 inspect.py:614(cleandoc)
+ 1052/959    0.023    0.000    0.066    0.000 {built-in method builtins.__build_class__}
+    33/31    0.018    0.001    0.023    0.001 {built-in method _imp.create_dynamic}
+     3228    0.014    0.000    0.014    0.000 {built-in method numpy.array}
+    40000    0.014    0.000    0.017    0.000 npyio.py:771(floatconv)
+    274/1    0.013    0.000   15.498   15.498 {built-in method builtins.exec}
+      556    0.011    0.000    0.011    0.000 <frozen importlib._bootstrap>:78(acquire)
+
 ```
````
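
The report above has the shape of Python's built-in cProfile output viewed through pstats, sorted by internal time and cut to ten rows. A minimal sketch of how such a report might be regenerated, assuming the profile data was first saved with `python3 -m cProfile -o heat.prof heat.py` (both file names are assumptions, not taken from the commit):

```python
# Sketch (not from the commit): regenerate a report like the one above.
# Assumes profile data was saved first with
#     python3 -m cProfile -o heat.prof heat.py
# where heat.py and heat.prof are assumed names.
import pstats

stats = pstats.Stats('heat.prof')
stats.sort_stats('time')    # prints the "Ordered by: internal time" header
stats.print_stats(10)       # prints "List reduced ... due to restriction <10>"
```

Under those assumptions, the top row (`heat.py:9(evolve)`) is where nearly all of the 15.498 seconds is spent, which is why the profile singles it out as the optimization target.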

exercise-instructions.md (+4, -4)

````diff
@@ -39,7 +39,7 @@ In class room workstations, one needs to load the MPI environment before using
 mpi4py:
 
 ```
-% module load mpi/openmpi-x86_64
+% module load mpi
 ```
 
 After that MPI parallel Python programs can be launched with mpirun, e.g. to
@@ -49,11 +49,11 @@ run with 4 MPI tasks one issues
 % mpirun -np 4 python3 example.py
 ```
 
-In Taito one can launch interactive MPI programs with srun:
+In Puhti one can launch interactive MPI programs with srun:
 
 ```
 % srun -n4 python3 hello.py
 ```
 
-Note that for real production calculations in Taito one should use batch job
-scripts, see https://research.csc.fi/taito-user-guide
+Note that for real production calculations in Puhti one should use batch job
+scripts, see https://docs.csc.fi/computing/running/getting-started/
````
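
For reference, a minimal sketch of the kind of `hello.py` the srun line above would launch; this particular implementation is an assumption, not necessarily the course's actual file:

```python
# Sketch (not necessarily the course's hello.py): a minimal mpi4py
# program of the kind launched above with "srun -n4 python3 hello.py".
from mpi4py import MPI

comm = MPI.COMM_WORLD      # communicator containing all MPI tasks
rank = comm.Get_rank()     # id of this task within the communicator
size = comm.Get_size()     # total number of tasks

print("Hello from rank {} of {}".format(rank, size))
```

Run with 4 tasks (via `srun -n4` or `mpirun -np 4`), each rank prints one line; the ordering of the lines across ranks is nondeterministic.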
