Bugfix/type safety #19

Merged
merged 9 commits into from Jun 20, 2024
Changes from all commits
2 changes: 1 addition & 1 deletion README.md
@@ -21,7 +21,7 @@ pip install nada-ai
```

### From Sources
You can install the nada-algebra library using Poetry:
You can install the nada-numpy library using Poetry:

```bash
git clone https://github.com/NillionNetwork/nada-ai.git
```
9 changes: 7 additions & 2 deletions examples/README.md
@@ -7,7 +7,12 @@ The following are the currently available examples:
- [Complex Model](./complex_model): shows how to build more intricate model architectures using Nada AI. Contains convolutions, pooling operations, linear layers and activations
- [Time Series](./time_series): shows how to run a Facebook Prophet time series forecasting model using Nada AI
- [Spam Detection Demo](./spam_detection): shows how to build a privacy-preserving spam detection model using Nada AI. Contains Logistic Regression and cleartext TF-IDF vectorization.
- [Multi-Layer Perceptron Demo](./multi_layer_perceptron): shows how to build a privacy-preserving medical image classification model using Nada AI. Features Convolutional Neural Network logic.

The Nada program source code is stored in `src/<EXAMPLE_NAME>.py`.
In order to run an example, simply:
1. Navigate to the example folder `cd <EXAMPLE_NAME>`
2. Build the program via `nada build`
3. (Optional) Test the program via `nada test`
4. Run the example / demo. This will either be a Python script you can run via `python3 main.py` or a Jupyter notebook where you can just run the cells.

In order to follow the end-to-end example, head to `network/compute.py`. You can run it by simply running `nada build` to build the Nada program followed by `python network/compute.py`.
The Nada program source code is stored in `src/<EXAMPLE_NAME>.py`.
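The run steps listed above can be sketched as a terminal session (the `nada` CLI from the Nillion SDK is assumed to be installed, and the example folder name is illustrative):

```bash
cd linear_regression   # 1. navigate to the example folder
nada build             # 2. compile the Nada program in src/
nada test              # 3. (optional) run the program tests
python3 main.py        # 4. run the demo (or open the Jupyter notebook instead)
```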
11 changes: 11 additions & 0 deletions examples/complex_model/README.md
@@ -0,0 +1,11 @@
# Complex model

This example shows how you can build and run an arbitrarily complex AI model - much like you can in PyTorch!

The model architecture is defined in `src/my_model.py`. You will notice that it is syntactically nearly identical to the equivalent PyTorch model.

This model is then used in the main Nada program - defined in `src/complex_model.py`. What this script does is simply:
- Load the model provided by Party0 via `my_model = MyModel()` and `my_model.load_state_from_network("my_model", parties[0], na.SecretRational)`
- Load in the (3, 4, 3) input data matrix called "my_input" provided by Party1 via `na.array((3, 4, 3), parties[1], "my_input", na.SecretRational)`
- Run inference via `result = my_model(my_input)`
- Return the inference result to Party1 via `return result.output(parties[1], "my_output")`
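Assembled from the bullet points above, the whole program is roughly the following sketch (it mirrors `src/complex_model.py` as described; it needs the Nada toolchain, so it is illustrative rather than standalone-runnable):

```python
import nada_numpy as na
from my_model import MyModel


def nada_main():
    # Party0 provides the model; Party1 provides the input and gets the output
    parties = na.parties(2)

    # Load MyModel's weights, stored on the network under the name "my_model"
    my_model = MyModel()
    my_model.load_state_from_network("my_model", parties[0], na.SecretRational)

    # (3, 4, 3) secret input matrix named "my_input", provided by Party1
    my_input = na.array((3, 4, 3), parties[1], "my_input", na.SecretRational)

    # Run inference, then reveal the result only to Party1 as "my_output"
    result = my_model(my_input)
    return result.output(parties[1], "my_output")
```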
@@ -1,24 +1,19 @@
import asyncio
import os
import sys
import time

import nada_algebra as na
import nada_numpy as na
import nada_numpy.client as na_client
import numpy as np
import py_nillion_client as nillion
import torch
from dotenv import load_dotenv

from nada_ai.client import TorchClient

# Add the parent directory to the system path to import modules from it
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "../..")))

import nada_algebra.client as na_client
# Import helper functions for creating nillion client and getting keys
from nillion_python_helpers import (create_nillion_client, getNodeKeyFromFile,
getUserKeyFromFile)

from nada_ai.client import TorchClient

# Load environment variables from a .env file
load_dotenv()

@@ -94,7 +89,7 @@ async def main():
party_id = client.party_id
user_id = client.user_id
party_names = na_client.parties(2)
program_name = "main"
program_name = "complex_model"
program_mir_path = f"./target/{program_name}.nada.bin"

if not os.path.exists("bench"):
4 changes: 2 additions & 2 deletions examples/complex_model/src/complex_model.py
@@ -1,9 +1,9 @@
import nada_algebra as na
import nada_numpy as na
from my_model import MyModel


def nada_main():
# Step 1: We use Nada Algebra wrapper to create "Party0" and "Party1"
# Step 1: We use Nada NumPy wrapper to create "Party0" and "Party1"
parties = na.parties(2)

# Step 2: Instantiate model object
4 changes: 3 additions & 1 deletion examples/complex_model/src/my_model.py
@@ -1,4 +1,4 @@
import nada_algebra as na
import nada_numpy as na

from nada_ai import nn

@@ -8,6 +8,7 @@ class MyConvModule(nn.Module):

def __init__(self) -> None:
"""Contains some ConvNet components"""
super().__init__()
self.conv = nn.Conv2d(kernel_size=2, in_channels=3, out_channels=2)
**Contributor Author:** note: this is not strictly mandatory for us but it is in torch. so just copied this here so that the syntax is exactly equal
**Member:** Makes sense, but at the same time I think having a direct coupling makes it much easier.

Setting this as a reminder to update the Nillion Docs in this regard.

**Contributor Author:** wdym by this?

self.pool = nn.AvgPool2d(kernel_size=2, stride=1)

@@ -29,6 +30,7 @@ class MyModel(nn.Module):

def __init__(self) -> None:
"""Model is a collection of arbitrary custom components"""
super().__init__()
self.conv_module = MyConvModule()
self.my_operations = MyOperations()
self.linear = nn.Linear(4, 2)
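The review thread above debates whether calling `super().__init__()` should be required. A plain-Python sketch (no torch; all class names here are hypothetical) shows why torch-style modules need it: attribute assignment consults a registry that only the base constructor creates.

```python
class Module:
    """Toy stand-in for torch.nn.Module: tracks submodules in a registry."""

    def __init__(self):
        object.__setattr__(self, "_modules", {})

    def __setattr__(self, name, value):
        if isinstance(value, Module):
            # Fails with AttributeError if a subclass skipped super().__init__()
            self._modules[name] = value
        object.__setattr__(self, name, value)


class Linear(Module):
    pass


class Good(Module):
    def __init__(self):
        super().__init__()     # registry exists before attributes are assigned
        self.layer = Linear()


class Bad(Module):
    def __init__(self):
        self.layer = Linear()  # no registry yet -> AttributeError


print(sorted(Good()._modules))  # ['layer']
try:
    Bad()
except AttributeError as exc:
    print("Bad() raised", type(exc).__name__)  # Bad() raised AttributeError
```

Real torch behaves the same way, which is why mirroring the call keeps the syntax identical even where nada-ai itself does not strictly need it.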
11 changes: 11 additions & 0 deletions examples/linear_regression/README.md
@@ -0,0 +1,11 @@
# Linear regression

This example shows how you can run a linear regression model using Nada AI. It highlights that, although Nada AI's design closely parallels PyTorch's, it also integrates with other frameworks, in this case `scikit-learn`.

You will find the Nada program in `src/linear_regression.py`.

What this script does is simply:
- Load the model provided by Party0 via `my_model = LinearRegression(in_features=10)` and `my_model.load_state_from_network("my_model", parties[0], na.SecretRational)`
- Load in the 10 input features as a 1-d array called "my_input" provided by Party1 via `my_input = na.array((10,), parties[1], "my_input", na.SecretRational)`
- Run inference via `result = my_model.forward(my_input)`
- Return the inference result to Party1 via `return [Output(result.value, "my_output", parties[1])]`
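Putting those steps together, the program in `src/linear_regression.py` is roughly this sketch (reconstructed solely from the bullets above; running it requires the Nada toolchain):

```python
import nada_numpy as na
from nada_dsl import Output

from nada_ai.linear_model import LinearRegression


def nada_main():
    # Party0 provides the model weights; Party1 provides input and gets output
    parties = na.parties(2)

    # A 10-feature linear regression, weights stored under the name "my_model"
    my_model = LinearRegression(in_features=10)
    my_model.load_state_from_network("my_model", parties[0], na.SecretRational)

    # 1-d secret array of 10 input features, provided by Party1 as "my_input"
    my_input = na.array((10,), parties[1], "my_input", na.SecretRational)

    # Run inference and reveal the result only to Party1 as "my_output"
    result = my_model.forward(my_input)
    return [Output(result.value, "my_output", parties[1])]
```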
@@ -1,23 +1,18 @@
import asyncio
import os
import sys
import time

import nada_algebra as na
import nada_numpy as na
import nada_numpy.client as na_client
import numpy as np
import py_nillion_client as nillion
from dotenv import load_dotenv
from sklearn.linear_model import LinearRegression

from nada_ai.client import SklearnClient

# Add the parent directory to the system path to import modules from it
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "../..")))

import nada_algebra.client as na_client
# Import helper functions for creating nillion client and getting keys
from nillion_python_helpers import (create_nillion_client, getNodeKeyFromFile,
getUserKeyFromFile)
from sklearn.linear_model import LinearRegression

from nada_ai.client import SklearnClient

# Load environment variables from a .env file
load_dotenv()
@@ -96,7 +91,7 @@ async def main():
party_id = client.party_id
user_id = client.user_id
party_names = na_client.parties(2)
program_name = "main"
program_name = "linear_regression"
program_mir_path = f"./target/{program_name}.nada.bin"

if not os.path.exists("bench"):
9 changes: 5 additions & 4 deletions examples/linear_regression/src/linear_regression.py
@@ -1,14 +1,15 @@
import nada_algebra as na
import nada_numpy as na
from nada_dsl import Output

from nada_ai.linear_model import LinearRegression


def nada_main():
# Step 1: We use Nada Algebra wrapper to create "Party0" and "Party1"
# Step 1: We use Nada NumPy wrapper to create "Party0" and "Party1"
parties = na.parties(2)

# Step 2: Instantiate linear regression object
my_model = LinearRegression(10)
my_model = LinearRegression(in_features=10)

# Step 3: Load model weights from Nillion network by passing model name (acts as ID)
# In this example Party0 provides the model and Party1 runs inference
@@ -22,4 +23,4 @@ def nada_main():
result = my_model.forward(my_input)

# Step 6: We can use result.output() to produce the output for Party1 and variable name "my_output"
return result.output(parties[1], "my_output")
return na.output(result, parties[1], "my_output")
4 changes: 2 additions & 2 deletions examples/linear_regression/tests/linear_regression.yaml
@@ -53,6 +53,6 @@ inputs:
public_variables: {}
expected_outputs:
# If you go in and crunch the numbers of this one, the result should be 16.877471923828125
# 16.877471923828125 * 2**16 = 1106082
# 16.877471923828125 * 2**16 = 1106085
my_output_0:
SecretInteger: "1106082"
SecretInteger: "1106085"
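The expected output in the manifest is a fixed-point encoding: rational results are scaled by `2**16` and stored as integers. A plain-Python sketch of that encoding (independent of the Nada toolchain; note that the comment's value `16.877471923828125` corresponds exactly to the old integer `1106082`, while the new `1106085` decodes to `16.8775177001953125`):

```python
SCALE = 2 ** 16  # fixed-point scale used for SecretRational outputs

def encode(x: float) -> int:
    """Encode a rational as a scaled integer, as stored in the test manifest."""
    return round(x * SCALE)

def decode(n: int) -> float:
    """Recover the rational value from its scaled-integer encoding."""
    return n / SCALE

print(encode(16.877471923828125))  # 1106082
print(decode(1106085))             # 16.8775177001953125
```

So the updated expected value implies the computed result shifted by 3 units of least precision, about 4.6e-5.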
10 changes: 5 additions & 5 deletions examples/multi_layer_perceptron/01_model_provider.ipynb
@@ -73,7 +73,7 @@
"\n",
"# Using Nada AI model client\n",
"from nada_ai.client import TorchClient\n",
"import nada_algebra as na\n",
"import nada_numpy as na\n",
"import py_nillion_client as nillion\n",
"from nillion_python_helpers import (\n",
" create_nillion_client,\n",
@@ -615,8 +615,8 @@
" Returns:\n",
" Dict[str, str]: Resulting `action_id` and `program_id`.\n",
" \"\"\"\n",
" action_id = await client.store_program(cluster_id, \"main\", nada_program_path)\n",
" program_id = f\"{user_id}/main\"\n",
" action_id = await client.store_program(cluster_id, \"multi_layer_perceptron\", nada_program_path)\n",
" program_id = f\"{user_id}/multi_layer_perceptron\"\n",
"\n",
" return {\n",
" \"action_id\": action_id,\n",
@@ -644,7 +644,7 @@
" client=model_provider_client,\n",
" cluster_id=cluster_id,\n",
" user_id=model_provider_user_id,\n",
" nada_program_path=\"target/main.nada.bin\",\n",
" nada_program_path=\"target/multi_layer_perceptron.nada.bin\",\n",
")\n",
"\n",
"action_id = result_store_program[\"action_id\"]\n",
@@ -804,7 +804,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.3"
"version": "3.12.2"
}
},
"nbformat": 4,
4 changes: 2 additions & 2 deletions examples/multi_layer_perceptron/02_model_inference.ipynb
@@ -44,8 +44,8 @@
"from dotenv import load_dotenv\n",
"import numpy as np\n",
"\n",
"import nada_algebra as na\n",
"import nada_algebra.client as na_client\n",
"import nada_numpy as na\n",
"import nada_numpy.client as na_client\n",
"import py_nillion_client as nillion\n",
"from nillion_python_helpers import (\n",
" create_nillion_client,\n",
4 changes: 1 addition & 3 deletions examples/multi_layer_perceptron/README.md
@@ -1,8 +1,6 @@
# Multi-Layer Perceptron Demo
**This folder was generated using `nada init`**

To execute this tutorial, you may potentially need to install the `requirements.txt` apart from nada-ai:
To execute this tutorial, you may need to install the `requirements.txt`:
```bash
pip install -r requirements.txt
```

4 changes: 2 additions & 2 deletions examples/multi_layer_perceptron/nada-project.toml
@@ -1,7 +1,7 @@
name = "text_classification"
name = "multi_layer_perceptron"
version = "0.1.0"
authors = [""]

[[programs]]
path = "src/main.py"
path = "src/multi_layer_perceptron.py"
prime_size = 128
@@ -1,11 +1,9 @@
"""MLP Nada program"""

import nada_algebra as na
import nada_numpy as na
from my_nn import MyNN


def nada_main():
# Step 1: We use Nada Algebra wrapper to create "Party0" and "Party1"
# Step 1: We use Nada NumPy wrapper to create "Party0" and "Party1"
parties = na.parties(2)

# Step 2: Instantiate model object
3 changes: 2 additions & 1 deletion examples/multi_layer_perceptron/src/my_nn.py
@@ -1,4 +1,4 @@
import nada_algebra as na
import nada_numpy as na

from nada_ai import nn

@@ -8,6 +8,7 @@ class MyNN(nn.Module):

def __init__(self) -> None:
"""Model is two layers and an activation"""
super(MyNN, self).__init__()
# Input size (1, 1, 16, 16) --> Output size (1, 2)
self.conv1 = nn.Conv2d(
in_channels=1, out_channels=2, kernel_size=3, padding=1, stride=4
5 changes: 5 additions & 0 deletions examples/multi_layer_perceptron/target/.gitignore
@@ -0,0 +1,5 @@
# This directory is kept purposely, so that no compilation errors arise.
# Ignore everything in this directory
*
# Except this file
!.gitignore