
Commit

update version related code
MAfarrag committed Jan 23, 2025
1 parent dd47d3c commit ecea0cd
Showing 4 changed files with 8 additions and 8 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -57,7 +57,7 @@ Installing llama-utils
Installing `llama-utils` from the `conda-forge` channel can be achieved by:

```
-conda install -c conda-forge llama-utils=0.1.0
+conda install -c conda-forge llama-utils=0.2.0
```

It is possible to list all the versions of `llama-utils` available on your platform with:
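The listing command itself is collapsed in this diff view. The following is only a sketch of how version listing is usually done with conda; the `conda search` subcommand is an assumed approach here, not a command taken from the file:

```
conda search llama-utils --channel conda-forge
```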
@@ -79,7 +79,7 @@ pip install git+https://github.com/Serapieum-of-alex/llama-utils
To install the latest release, you can simply use pip:

```
-pip install llama-utils==0.1.0
+pip install llama-utils==0.2.0
```

Quick start
5 changes: 3 additions & 2 deletions docs/change-log.md
@@ -2,6 +2,7 @@
- initial design


-# 0.2.0 (2025-**-**)
+# 0.2.0 (2025-01-24)
## Dev
-- setup mkdocs
+- setup mkdocs
+- take llm models and embedding models dependencies out of the main package dependencies.
5 changes: 2 additions & 3 deletions docs/index.md
Expand Up @@ -55,7 +55,7 @@ Installing llama-utils
Installing `llama-utils` from the `conda-forge` channel can be achieved by:

```
-conda install -c conda-forge llama-utils=0.1.0
+conda install -c conda-forge llama-utils=0.2.0
```

It is possible to list all the versions of `llama-utils` available on your platform with:
@@ -77,7 +77,7 @@ pip install git+https://github.com/Serapieum-of-alex/llama-utils
To install the latest release, you can simply use pip:

```
-pip install llama-utils==0.1.0
+pip install llama-utils==0.2.0
```

Quick start
@@ -142,4 +142,3 @@ time=2024-11-22T23:20:04.592+01:00 level=INFO source=types.go:123 msg="inference

you can change the port by running the following command
`ollama serve --port 11435`
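Note: depending on the installed ollama version, `ollama serve` may not accept a `--port` flag; the listening address is commonly configured through the `OLLAMA_HOST` environment variable instead. A minimal sketch, assuming a Unix shell:

```
OLLAMA_HOST=127.0.0.1:11435 ollama serve
```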

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "llama-utils"
version = "0.1.0"
version = "0.2.0"
description = "LLM utilities for the Llama project"
authors = ["Mostafa Farrag <[email protected]>"]
readme = "README.md"
