- 🖥️ **Interactive Shell**: Command Ray nodes with a familiar shell interface
- 🎲 **Smart Connection**: Automatically connects to a random worker node or a specific target
- 🌐 **Remote Mode**: Work on the remote cluster with your local working directory, as if you were on your local machine
- 🚀 **Job Submission**: Run Python/Bash files as Ray jobs, just as you would locally
First, find out your Ray cluster's Ray version and Python version - RaySSH must align with both.
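If you are unsure of the versions, one quick way to check (a sketch, assuming you have SSH access to the head node; substitute your own user and host):

```bash
# Print the cluster's Ray and Python versions from the head node
ssh user@head-node 'ray --version && python --version'
```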
Then, install RaySSH using uv, so that it runs in a dedicated environment that aligns with your Ray cluster:

```bash
# Install RaySSH as a tool with the Ray executables and a specific Python
uv tool install \
  --with-executables-from "ray[default]==<target_cluster_ray_version>" \
  --python <target_cluster_python_version> \
  git+https://github.com/kivenchen/RaySSH.git
```

After installation, you can use the `rayssh` command directly from your terminal.
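To sanity-check the install (a minimal sketch; `uv tool list` is standard uv, and `rayssh --ls` needs `RAY_ADDRESS` pointing at a reachable cluster):

```bash
uv tool list   # RaySSH should appear with its pinned Python
rayssh --ls    # prints the cluster's nodes table
```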
**Basic Commands:**

- `rayssh` — Random worker connection
- `rayssh -l` — Interactive node selection
- `rayssh <ip|node_id|-index>` — Connect to a specific node
- `rayssh <dir>` — Remote mode with directory upload
- `rayssh <file>` — Submit and run a file as a Ray job
- `rayssh lab [-q] [path]` — Launch JupyterLab; optional upload path
- `rayssh code [-q] [path]` — Launch code-server (VS Code); optional upload path
- `rayssh --ls` — Print the nodes table
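A few example invocations (illustrative; the IP and index below are placeholders for whatever your own `rayssh --ls` prints):

```bash
rayssh --ls        # list node IDs, IPs, and indices
rayssh -0          # connect to the node at index 0 (the -index form above)
rayssh 10.0.0.12   # connect to the node with this IP
```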
**Configuration:**

```bash
export RAY_ADDRESS=ray://remote-cluster-host:port
```

A typical first session:

```bash
# 1) Configure once for your cluster
export RAY_ADDRESS=ray://remote-cluster-host:10001

# 2) Connect and inspect
rayssh          # random worker
> ray status

# 3) Remote dev with upload, then open VS Code in the browser
rayssh code -q ~/my-project   # uploads path, prints URL, exits (server keeps running)
```
**Set up remote development environment:**
```bash
# One-time setup
export RAY_ADDRESS=ray://gpu-cluster.company.com:10001
```

Then work remotely like you're local:

```bash
cd ~/machine-learning-project
# Control which files to upload and which to ignore
echo "*.parquet" >> .gitignore
# You can also customize your runtime environment in a runtime_env.yaml
vim runtime_env.yaml
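# For example, a minimal runtime_env.yaml might look like this (a sketch:
# pip and excludes are standard Ray runtime_env fields; the package is illustrative)
cat > runtime_env.yaml <<'EOF'
pip:
  - numpy
excludes:
  - "*.parquet"
EOF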
# Upload your project and start working
rayssh sync .
# Your project files are now uploaded to the remote node
> ls # See your uploaded files
> mount -t nfs 192.168.1.100:/workspace/datasets /mnt/datasets
> vim train_config.py # Edit remote copies of your files; changes sync back to local automatically
> python train.py # Run training on cluster
```

Other handy one-liners:

```bash
rayssh -l       # choose node interactively
rayssh lab -q   # prints URL, exits (server keeps running)
```
## Interactive Shell Features
Once connected, you get a full shell experience - do whatever you'd like
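For example, a typical session might look like this (illustrative; anything the node's shell offers will work, and the log path is Ray's default):

```bash
> nvidia-smi                                          # inspect GPUs on the node
> tail -n 20 /tmp/ray/session_latest/logs/raylet.out  # peek at Ray's logs
> exit                                                # leave the session
```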
## Configuration
**Environment Variables:**
```bash
# Remote cluster connection
export RAY_ADDRESS=ray://cluster:10001
# Ray configuration (optional)
export RAY_CLIENT_RECONNECT_GRACE_PERIOD=60
```

## Requirements

- Python >= 3.8
- Ray >= 2.0.0
- Network access to your Ray cluster
- For "rayssh " job submission, access to head node is sufficient.
- For other features, access to worker nodes is also required.
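For example, submitting a script as a job needs only the head node (a sketch; `train.py` stands in for your own script):

```bash
export RAY_ADDRESS=ray://head-node:10001
rayssh train.py    # runs the file as a Ray job on the cluster
```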
**Performance:**

- Upload only the files you need. Use `.gitignore` to exclude files from the working-directory upload.
## License

MIT License - see the LICENSE file for details.