
Commit 86cd6eb

scottyhq and e-marshall authored
Add instructions for executing tutorial notebooks on CryoCloud JupyterHub (#41)
* jupyterhub instructions
* wording change

Co-authored-by: e-marshall <[email protected]>
1 parent dc5defd commit 86cd6eb

File tree

3 files changed: +53 -343 lines changed


book/background/software.md

+52 -1
@@ -1,6 +1,6 @@
# 2.5 Software and computing environment

- On this page you'll find information about the computing environment and datasets that will be used in both of the tutorials in this book.
+ On this page you'll find information about the computing environment that will be used in both of the tutorials in this book. We provide instructions for running locally (on a laptop) or on a hosted JupyterHub in AWS us-west-2.

## *Running tutorial materials locally*

@@ -17,6 +17,8 @@ There are two options for creating a software environment: [pixi](https://pixi.s
```pixi run itslive```
```pixi run sentinel1```

Note that the first `pixi run` will download specific versions of all required Python libraries to a hidden directory `./.pixi`. Subsequent runs activate that environment and execute code within it. You can also run `pixi shell` to "activate" the environment (set paths to executables and auxiliary files) and `exit` to deactivate it.
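
For example, a typical first session might look like the following (a sketch; run the commands from the root of the cloned repository, where the pixi manifest lives):

```
# The first run resolves and downloads the environment into ./.pixi,
# then launches the ITS_LIVE tutorial task
pixi run itslive

# Activate the environment in your current shell instead...
pixi shell
# ...and deactivate it when you are done
exit
```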

### To use conda/mamba

1. Clone this book's GitHub repository:
@@ -32,3 +34,52 @@ There are two options for creating a software environment: [pixi](https://pixi.s
```jupyterlab```

Both tutorials use functions that are stored in scripts associated with each dataset. You can find these scripts here: [`itslive_tools.py`](../itslive/nbs/itslive_tools.py) and [`s1_tools.py`](../s1/nbs/s1_tools.py).

## *Running tutorial materials on a hosted JupyterHub*

Many NASA datasets, including ITS_LIVE, are hosted in the AWS us-west-2 data center. While these tutorial notebooks are designed to run on any computer, if you intend to modify the notebooks and access data directly, it is desirable to run computations in the same data center. A convenient way to access a computer in AWS us-west-2 is to use a hosted JupyterHub platform such as one of the following:

- https://docs.openveda.cloud/user-guide/scientific-computing/
- https://opensarlab-docs.asf.alaska.edu
- https://book.cryointhecloud.com/content/Getting_Started.html

On these systems you can install the software environment in the same way described above, but you must make the default JupyterLab interface aware of your environment. In Jupyter terminology, you must specify a 'kernel'. Unfortunately, there is no automatic and uniform way of doing this, but it can be done with a few manual steps:

1. Create a kernel specification subfolder under your home directory:
```
mkdir -p /home/jovyan/.local/share/jupyter/kernels/pixi/
```

2. Use a text editor to add the following JSON to a `kernel.json` file in the directory we created above (`/home/jovyan/.local/share/jupyter/kernels/pixi/kernel.json`):
```
{
  "argv": [
    "/home/jovyan/.pixi/bin/pixi",
    "run",
    "python",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}"
  ],
  "display_name": "Pixi (default)",
  "language": "python",
  "metadata": {
    "debugger": true
  }
}
```
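
Optionally, you can confirm that the file you just created is valid JSON before reloading anything (a small extra check, not part of the original instructions; `json.tool` is in the Python standard library, so the hub's default Python works):

```
# Prints the parsed JSON on success, or an error pointing at the problem
python -m json.tool /home/jovyan/.local/share/jupyter/kernels/pixi/kernel.json
```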

Once created, you should see this kernel listed in the output of `jupyter kernelspec list`:
```
Available kernels:
  python3    /srv/conda/envs/notebook/share/jupyter/kernels/python3
  pixi       /home/jovyan/.local/share/jupyter/kernels/pixi
```

Finally, you may need to reload your web browser in order to see 'Pixi (default)' as an available kernel to select when you open one of the Jupyter Notebooks in this repository. With 'Pixi (default)' as the selected kernel, code in the notebook will use the environment defined in your `.pixi` folder!
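
As a final sanity check (a sketch, not part of the original instructions), you can ask pixi which interpreter the new kernel will launch. Run this from the cloned repository directory, since `pixi run` resolves the environment relative to the project:

```
# Should print a Python interpreter inside the repository's ./.pixi directory,
# not the hub default at /srv/conda/envs/notebook/bin/python
pixi run python -c "import sys; print(sys.executable)"
```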
