# Linaro Forge (DDT) debugger

[Linaro Forge](https://www.linaroforge.com/downloadForge) (formerly known as DDT) allows source-level debugging of Fortran,
C, C++ and Python codes. It can be used for debugging serial, multi-threaded
(OpenMP), multi-process (MPI) and accelerated (Cuda, OpenACC) programs running
on research and production systems, including the CSCS Alps system. It can be
executed either as a graphical user interface or from the command-line.

## Usage notes

The uenv is named `linaro-forge`, and the available versions on a cluster can be determined using the `uenv image find` command, for example:
```
> uenv image find linaro-forge
uenv/version:tag           uarch date       id               size
linaro-forge/23.1.2:latest gh200 2024-04-10 ea67dbb33801c7c3 342MB
```

The Linaro tools are configured to be mounted at the `/user-tools` path so that they can be used alongside application and development uenv mounted at `/user-environment`.

=== "sidecar"

    When using Linaro Forge alongside another uenv, start a session with both uenv, listing `linaro-forge` after the main uenv, so that the images are mounted at the respective `/user-environment` and `/user-tools` locations:

    ```bash
    uenv start prgenv-gnu/24.2:v3 linaro-forge/23.1.2

    # test that everything has been mounted correctly
    # (will give warnings if there are problems)
    uenv status

    uenv view prgenv-gnu:default
    source /user-tools/activate

    # check that ddt is in the path
    ddt --version
    ```

    The `/user-tools/activate` script will make the forge executables available in your environment, and **must be run after** any other `uenv view` command.

=== "standalone"

    When using the uenv with no other environment mounted, you will need to explicitly set the `/user-tools` mount point:

    ```bash
    uenv start linaro-forge/23.1.2:/user-tools

    source /user-tools/activate

    # check that ddt is in the path
    ddt --version
    ```

    The `/user-tools/activate` script will make the forge executables available in your environment.

## Getting Started

In order to debug your code on Alps, you need to:

1. pull the `linaro-forge` uenv on the target Alps vCluster
2. install the Forge/DDT client on your laptop
3. build an executable with debug flags
4. launch a job with the debugger on Alps.

### Pull the Linaro Forge uenv on the Alps cluster

The first step is to download the latest version of linaro-forge that is available on the cluster.
First, SSH into the target system, then use the `uenv image find` command to list the available versions on the system:

```
> uenv image find linaro-forge
uenv/version:tag           uarch date       id               size
linaro-forge/23.1.2:latest gh200 2024-04-10 ea67dbb33801c7c3 342MB
```

In this example, there is a single version available. Next we pull the image so that it is available locally:
```
> uenv image pull linaro-forge/23.1.2:latest
```

It will take a few seconds to download the image. Once complete, check that it was downloaded using the `uenv image ls` command:

```
> uenv image ls linaro-forge
uenv/version:tag           uarch date       id               size
linaro-forge/23.1.2:latest gh200 2024-04-05 ea67dbb33801c7c3 342MB
```

### Install the client on your laptop

We recommend installing the [desktop client](https://www.linaroforge.com/downloadForge) on your local workstation/laptop.
It can be configured to connect with the debug jobs running on Alps, offering a better user experience compared to running remotely with X11 forwarding.
The client can be downloaded for a selection of operating systems via the link above.

Once installed, the client needs to be configured to connect to the vCluster on which you are working.
First, start the client on your laptop.

=== "Linux"

    The path will change if you have installed a different version, or if it has been installed in a non-standard location.

    ```bash
    $HOME/linaro/forge/23.0.1/bin/ddt
    ```

=== "MacOS"

    The path will change if you have installed a different version, or if it has been installed in a non-standard location.

    ```bash
    open /Applications/Linaro\ Forge\ Client\ 23.0.1.app/
    ```

Next, configure a connection to the target system.
Open the *Remote Launch* menu and click on *configure* then *Add*. Examples of the settings are below.

=== "Eiger"

    | Field                         | Value                                                                |
    | ----------------------------- | -------------------------------------------------------------------- |
    | Connection                    | `eiger`                                                              |
    | Host Name                     | `bsmith@ela.cscs.ch bsmith@eiger.cscs.ch`                            |
    | Remote Installation Directory | `uenv run linaro-forge/23.1.2:/user-tools -- /user-tools/env/forge/` |
    | Private Key                   | `$HOME/.ssh/cscs-key`                                                |

=== "Santis"

    | Field                         | Value                                                                |
    | ----------------------------- | -------------------------------------------------------------------- |
    | Connection                    | `santis`                                                             |
    | Host Name                     | `bsmith@ela.cscs.ch bsmith@santis.cscs.ch`                           |
    | Remote Installation Directory | `uenv run linaro-forge/23.1.2:/user-tools -- /user-tools/env/forge/` |
    | Private Key                   | `$HOME/.ssh/cscs-key`                                                |

Some notes on the examples above:

* SSH forwarding via `ela.cscs.ch` is used to access the cluster.
* Replace the username `bsmith` with the CSCS user name that you would normally use to open an SSH connection to CSCS.
* The Remote Installation Directory is more involved than a plain path: it uses `uenv run` to mount the `linaro-forge` uenv at `/user-tools` and points the client at the Forge installation inside the image.
* The private key should be the one generated for CSCS MFA; this field does not need to be set if you have added the key to your SSH agent.

Once configured, test and save the configuration:

1. To check that the configuration is correct, click `Test Remote Launch`.
2. Click `ok` and `close` to save the configuration.
3. You can now connect by going to `Remote Launch` and choosing the entry that you just created. If the client fails to connect, read the error message, check your SSH configuration, and make sure that you can SSH to the target system without the client (a quick way to test this is shown below).

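To verify that the SSH setup the client relies on actually works, you can test key-based access through the jump host from your laptop. A minimal sketch, assuming the Eiger example above (replace the username, key path and target host with your own):

```bash
# test SSH access to the cluster through the ela jump host, using the CSCS MFA key
ssh -J bsmith@ela.cscs.ch -i ~/.ssh/cscs-key bsmith@eiger.cscs.ch hostname
```

If this prints the remote hostname without prompting for a password, `Test Remote Launch` should succeed with the same user, key and jump host.
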
### Setup the environment

Follow the steps in the *Usage notes* section above to start a uenv session with both your application uenv and the `linaro-forge` uenv mounted, then source the `/user-tools/activate` script so that the Forge executables are available.

### Build with debug flags

Once the uenv is loaded and activated, the program to debug must be compiled with the `-g` (for CPU) and `-G` (for GPU) debugging flags. For example, let's build a CUDA code with a user environment:

```bash
uenv start prgenv-gnu/24.2:v2
uenv view default

# download the source code
git clone https://github.com/sekelle/octree-miniapp.git

# build the application
make -C octree-miniapp/
```
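
If you are building your own code rather than the mini-app, the debug flags go on the compile line. A minimal sketch with hypothetical file names (not part of the mini-app):

```bash
# CPU code: -g embeds debug symbols, -O0 avoids optimisations that confuse stepping
gcc -g -O0 -o solver solver.c

# CUDA code: -G generates device debug symbols; the host code still needs -g
nvcc -g -G -O0 -o solver_gpu solver_gpu.cu
```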

### Launch the code with the debugger

To debug with the uenv, connect the DDT client to the target cluster via `Remote Launch`, then launch the job on the cluster so that it connects back to the waiting client:

??? note

    The steps below do not use the client's `Manual Launch` mode; the job is launched directly with `ddt --connect srun ...` on the target cluster, and connects back to the client.

=== "on laptop"

    Start DDT, and connect to the target cluster using the `Remote Launch` drop-down menu.

    Then wait for the job to start (see the "on Alps" tab).

=== "on Alps"

    Log into the system and launch the job with the `srun` command:

    ```bash
    # start a session with both the PE used to build your application
    # and the linaro-forge uenv mounted
    uenv start prgenv-gnu/24.2 linaro-forge/23.1.2
    ddt --connect srun -n2 -N2 ./a.out
    ```
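
    If `ddt` is not found in the session, activate the Forge tools first, as described in the usage notes above. A minimal sketch:

    ```bash
    # make the forge executables available in this session
    source /user-tools/activate
    ddt --version   # sanity check before launching
    ```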

Notes on using specific systems:

=== "santis"

    !!! warning

        Because Santis is not connected to the internet, some environment variables need to be set so that the tools can connect to the license server.

    ```bash
    export https_proxy=proxy.cscs.ch:8080
    export http_proxy=proxy.cscs.ch:8080
    export no_proxy=".local, .cscs.ch, localhost, 148.187.0.0/16, 10.0.0.0/8, 172.16.0.0/12"
    ```

    ???- note "default value of `http_proxy`"

        By default the `https_proxy` and `http_proxy` variables are set to `http://proxy.cscs.ch:8080`, as the transport is required for some other services to work. You will have to set them as shown above for a debugging session.
121 | 210 |
|
122 | 211 | This screenshot shows a debugging session on 12 gpus:
|
123 | 212 |
|
124 |
| - |
| 213 | + |
0 commit comments