
Commit a9b872f

Remove OpenAI dependency (#183)

dnandakumar-nv authored Sep 19, 2024
1 parent 9720d61 commit a9b872f
Showing 9 changed files with 284 additions and 830 deletions.
3 changes: 3 additions & 0 deletions community/event-driven-rag-cve-analysis/Dockerfile
@@ -66,5 +66,8 @@ RUN source activate morpheus &&\
     jupyter contrib nbextension install --user &&\
     pip install jupyterlab_nvdashboard==0.9

+RUN source activate morpheus &&\
+    pip install --upgrade langchain-nvidia-ai-endpoints
+
 # Launch jupyter
 CMD ["jupyter-lab", "--ip=0.0.0.0", "--no-browser", "--allow-root"]
10 changes: 3 additions & 7 deletions community/event-driven-rag-cve-analysis/README.md
@@ -28,11 +28,7 @@ You will also need to have a `Morpheus 24.03` docker container built and present

 ### NVIDIA GPU Cloud

-To access the NVIDIA hosted Inference Service, you will need to have the following environment variables set: `OPENAI_API_KEY`. To obtain the API key, please visit the [NVIDIA website](https://build.nvidia.com/) for instructions on generating your API key.
-
-It's important to note here that although we store the NGC API Key under the `OPENAI_API_KEY` variable, we will be interacting with NVIDIA hosted LLMs and not OpenAI LLMs.
-
-NVIDIA NIM microservices are OpenAI API compliant to maximize usability, so we will be using the `openai` package as a wrapper to make API calls.
+To access the NVIDIA hosted Inference Service, you will need to have the following environment variables set: `NVIDIA_API_KEY`. To obtain the API key, please visit the [NVIDIA website](https://build.nvidia.com/) for instructions on generating your API key.
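Not part of the diff, but for readers wiring this up: the new code expects the key under `NVIDIA_API_KEY`, so a fail-fast guard at startup can save a confusing error later. A minimal sketch (the helper name and placeholder key value are illustrative, not from the repo):

```python
import os


def require_nvidia_api_key(env=os.environ) -> str:
    """Return the NVIDIA API key, failing fast with a clear message if unset."""
    try:
        return env["NVIDIA_API_KEY"]
    except KeyError:
        raise RuntimeError(
            "NVIDIA_API_KEY is not set; generate one at https://build.nvidia.com/"
        ) from None


# Placeholder value for illustration only, not a real key
print(require_nvidia_api_key({"NVIDIA_API_KEY": "nvapi-placeholder"}))
```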

### Building a Morpheus Container

@@ -53,13 +49,13 @@ If you are using a Morpheus version that is not `v24.03.02-runtime`, please upda
 ```
 ### Creating an Environment File

-To automatically use these API keys, you can set the `OPENAI_API_KEY` value in the `docker-compose.yml` file in this directory as follows:
+To automatically use these API keys, you can set the `NVIDIA_API_KEY` value in the `docker-compose.yml` file in this directory as follows:

 ```bash
 environment:
   - TERM=${TERM:-}
   # Workaround until this is working: https://github.com/docker/compose/issues/9181#issuecomment-1996016211
-  - OPENAI_API_KEY=<BUILD_NV_API_KEY>
+  - NVIDIA_API_KEY=<BUILD_NV_API_KEY>
   # Overwrite any environment variables in the .env file with URLs needed in the network
-  - OPENAI_API_BASE=https://integrate.api.nvidia.com/v1
+  - OPENAI_BASE_URL=https://integrate.api.nvidia.com/v1
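Application code inside the container can then pick these values up from the environment. A hypothetical helper (the function name and fallback behavior are illustrative, not from the repo) showing how the two compose-provided variables might be resolved:

```python
import os


def resolve_nim_endpoint(env=None):
    """Return (base_url, api_key) for the NVIDIA-hosted, OpenAI-compatible API.

    Falls back to the integrate.api.nvidia.com endpoint when the
    compose-provided variable is absent.
    """
    env = dict(os.environ) if env is None else env
    base_url = env.get("OPENAI_BASE_URL", "https://integrate.api.nvidia.com/v1")
    api_key = env.get("NVIDIA_API_KEY", "")
    return base_url, api_key


url, key = resolve_nim_endpoint({"NVIDIA_API_KEY": "nvapi-placeholder"})
print(url)
```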
12 changes: 0 additions & 12 deletions community/event-driven-rag-cve-analysis/cyber_dev_day/config.py
@@ -55,16 +55,6 @@ class NVFoundationLLMModelConfig(BaseModel):
     temperature: float = 0.0


-class OpenAIServiceConfig(BaseModel):
-    type: typing.Literal["openai"] = "openai"
-
-
-class OpenAIMModelConfig(BaseModel):
-    service: OpenAIServiceConfig
-
-    model_name: str
-
-
 class NIMServiceConfig(BaseModel):
     type: typing.Literal["NIM"] = "NIM"

@@ -73,13 +63,11 @@ class NIMModelConfig(BaseModel):
     service: NIMServiceConfig

     model_name: str
-    base_url: str
     temperature: float = 0.0
     top_p: float = 1


 LLMModelConfig = typing.Annotated[typing.Annotated[NeMoLLMModelConfig, Tag("nemo")]
-                                  | typing.Annotated[OpenAIMModelConfig, Tag("openai")]
                                   | typing.Annotated[NVFoundationLLMModelConfig, Tag("nvfoundation")]
                                   | typing.Annotated[NIMModelConfig, Tag("NIM")],
                                   Discriminator(_llm_discriminator)]
@@ -153,7 +153,7 @@ def create(service_type: str, *service_args, **service_kwargs) -> "LLMService":
         pass

     @staticmethod
-    def create(service_type: str | typing.Literal["nemo"] | typing.Literal["openai"], *service_args, **service_kwargs):
+    def create(service_type: str | typing.Literal["nemo"] | typing.Literal["nim"], *service_args, **service_kwargs):
         """
         Returns a service for interacting with LLM models.
