v0.4.0 (#133)
- pip wheel CI release integration
- DRAM legend redesign to better support long buffer lists
- Tensor deallocation report in operation details (inputs/outputs only)
- Memory leak reporting in operation details (inputs/outputs only)
- CBs timeline view
- Allow multiple instances of the visualizer to run simultaneously
- Annotate CBs on plot
- Clear demarcation of CBs vs tensors and improved CBs plot output hinting
- Improved SSH connection-related error messaging
- Improved UI for remote connection
- Backwards compatibility with old data format
- Fix for negative memory fragmentation records
- Fix for deallocation lookup logic
- Fix for SSH report list API generating 500
- Fix for L1 plot becoming randomly unresponsive
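The deallocation lookup fix boils down to guarding a `cursor.fetchone()` result before unpacking it, as the `queries.py` hunk in this commit shows. A minimal sketch of the pattern against a hypothetical `buffers` schema — the real `Buffer` model in ttnn-visualizer carries more fields:

```python
import sqlite3
from collections import namedtuple

# Hypothetical row shape; illustrative only.
Buffer = namedtuple("Buffer", ["operation_id", "address", "size"])

def query_next_buffer(cursor, operation_id, address):
    # fetchone() returns None when no row matches, so unpacking it
    # blindly with Buffer(*row) would raise TypeError.
    cursor.execute(
        "SELECT operation_id, address, size FROM buffers "
        "WHERE address = ? AND operation_id > ? "
        "ORDER BY operation_id LIMIT 1",
        (address, operation_id),
    )
    row = cursor.fetchone()
    if row:
        return Buffer(*row)
    return None

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE buffers (operation_id INTEGER, address INTEGER, size INTEGER)")
cur.execute("INSERT INTO buffers VALUES (2, 4096, 1024)")
print(query_next_buffer(cur, 1, 4096))  # Buffer(operation_id=2, address=4096, size=1024)
print(query_next_buffer(cur, 1, 8192))  # None (previously raised TypeError)
```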
aidemsined authored Oct 2, 2024
2 parents 04f5945 + 82c7d11 commit c2aa5ff
Showing 32 changed files with 829 additions and 471 deletions.
20 changes: 11 additions & 9 deletions .github/workflows/build-wheels.yml
@@ -60,22 +60,24 @@ jobs:
- name: get-npm-version
id: package-version
uses: martinbeentjes/[email protected]

- name: Get wheel file name
id: get_wheel
run: |
- name: Get wheel file name
id: get_wheel
run: |
# Find the .whl file and store its name in a variable
FILE=$(find /home/runner/work/ttnn-visualizer/ -name "*.whl" -type f)
WHEEL_FILE_NAME=$(basename $FILE)
echo "Found wheel file: $FILE"
# Set output to the found file name
echo "wheel_name=$FILE" >> $GITHUB_ENV
echo "wheel_file_name=$WHEEL_FILE_NAME" >> $GITHUB_ENV
- name: Upload Wheel to Release
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ github.event.release.upload_url }}
asset_path: ${{ env.wheel_name }}
asset_name: ttnn-visualizer-${{ github.event.release.tag_name }}.whl
asset_content_type: application/octet-stream
upload_url: ${{ github.event.release.upload_url }}
asset_path: ${{ env.wheel_name }}
asset_name: ${{env.wheel_file_name}}
asset_content_type: application/octet-stream
3 changes: 3 additions & 0 deletions .gitignore
@@ -42,3 +42,6 @@ backend/ttnn_visualizer/static
backend/*egg-info
build
.env


sessions.db
33 changes: 26 additions & 7 deletions README.md
@@ -2,8 +2,33 @@

A tool for visualizing the Tenstorrent Neural Network (TTNN) model.

- [Running Application](#running-application)
+ [Installing as Wheel](#installing-as-wheel)
+ [Downloading Docker Image](#downloading-docker-image)
- [Running Image](#running-image)
+ [SSH](#ssh)

- [Contributing](#contributing)
* [React + TypeScript + Vite](#react-typescript-vite)
* [Expanding the ESLint configuration](#expanding-the-eslint-configuration)
* [Environment](#environment)
* [Frontend](#frontend)
* [Backend](#backend)
* [Development](#development)
+ [Fix for python random errors not finding modules:](#fix-for-python-random-errors-not-finding-modules)
+ [Fix for missing distutils package](#fix-for-missing-distutils-package)
* [Docker](#docker)
+ [Running project](#running-project)

## Running Application


### Installing as Wheel

Download the wheel file from the [releases page](https://github.com/tenstorrent/ttnn-visualizer/releases) and install it using `pip install release_name.whl`. After installation,
simply run `ttnn-visualizer` to start the application.


### Downloading Docker Image

Before executing the command below, please see the note on SSH agent configuration.
@@ -142,7 +167,7 @@ source myenv/bin/activate
Install requirements

```shell
pip install -r backend/requirements.txt
pip install -r backend/ttnn_visualizer/requirements.txt
```

Starting the server
@@ -189,9 +214,3 @@ To run the application you can simply run `docker-compose up web`. To rebuild ad
To use the [provided SSH container](./docker/SSH/README.md) with the compose configuration, substitute `ssh` for `web` in the above commands. To run the container in the background, use `docker-compose up ssh -d`.

To connect to this container through the remote connection manager, use the name of the service (`ssh`) as the host and the default SSH port 22.

### Installing as Wheel

Download the wheel file from the [releases page]() and install using `pip install release_name.whl`. After installation
simply run `ttnn-visualizer` to start the application.

25 changes: 8 additions & 17 deletions backend/ttnn_visualizer/app.py
Original file line number Diff line number Diff line change
@@ -10,7 +10,7 @@
from flask_cors import CORS
from ttnn_visualizer import settings
from dotenv import load_dotenv
from ttnn_visualizer.database import create_update_database
from ttnn_visualizer.sessions import init_sessions, CustomRequest, init_session_db


def create_app(settings_override=None):
@@ -31,6 +31,7 @@ def create_app(settings_override=None):
flask_env = environ.get("FLASK_ENV", "development")

app = Flask(__name__, static_folder=static_assets_dir, static_url_path="/")
app.request_class = CustomRequest

app.config.from_object(getattr(settings, flask_env))

@@ -41,22 +42,12 @@ def create_app(settings_override=None):
if settings_override:
app.config.update(settings_override)

init_session_db()

middleware(app)

app.register_blueprint(api)

# Ensure there is always a schema to reference
# In the future we can probably re-init the DB or
# wait for initialization until the user has provided a DB
ACTIVE_DATA_DIRECTORY = app.config["ACTIVE_DATA_DIRECTORY"]

active_db_path = Path(ACTIVE_DATA_DIRECTORY, "db.sqlite")
active_db_path.parent.mkdir(exist_ok=True, parents=True)
empty_db_path = Path(__file__).parent.resolve().joinpath("empty.sqlite")

if not active_db_path.exists():
shutil.copy(empty_db_path, active_db_path)

extensions(app)

if flask_env == "production":
@@ -103,13 +94,13 @@ def middleware(app: flask.Flask):
app.wsgi_app = ProxyFix(app.wsgi_app)

# CORS configuration
origins = ["http://localhost:5173"]
origins = ["http://localhost:5173", "http://localhost:8000"]

init_sessions(app)

CORS(
app,
origins=origins,
allow_headers="*",
methods="*",
supports_credentials=True,
)

return None
3 changes: 2 additions & 1 deletion backend/ttnn_visualizer/queries.py
Original file line number Diff line number Diff line change
@@ -200,7 +200,8 @@ def query_next_buffer(cursor, operation_id, address):
"""
cursor.execute(query, (address, operation_id))
row = cursor.fetchone()
return Buffer(*row)
if row:
return Buffer(*row)


def query_device_operations_by_operation_id(cursor, operation_id):
31 changes: 23 additions & 8 deletions backend/ttnn_visualizer/remotes.py
@@ -60,32 +60,38 @@ def remote_handler(*args, **kwargs):
try:
return func(*args, **kwargs)
except AuthenticationException as err:
logger.error(f"Unable to authenticate: {str(err)}")
raise RemoteFolderException(
status=403, message=f"Unable to authenticate: {str(err)}"
)
except FileNotFoundError as err:
logger.error(f"Unable to open {connection.path}: {str(err)}")
raise RemoteFolderException(
status=500, message=f"Unable to open path {connection.path}: {str(err)}"
status=400, message=f"Unable to open path {connection.path}"
)
except NoProjectsException as err:
logger.error(f"No projects at {connection.path}: {str(err)}")
raise RemoteFolderException(
status=400,
message=f"No projects found at remote location: {connection.path}: {str(err)}",
message=f"No projects found at remote location: {connection.path}",
)
except NoValidConnectionsError as err:
logger.error(f"Unable to connect to host: {str(err)}")
raise RemoteFolderException(
status=500,
message=f"Unable to connect to host {connection.host}: {str(err)}",
message=f"Unable to connect to host {connection.host}",
)

except IOError as err:
logger.error(f"Error opening remote file: {str(err)}")
raise RemoteFolderException(
status=400,
message=f"Error opening remote folder {connection.path}: {str(err)}",
message=f"Error opening remote folder {connection.path}",
)
except SSHException as err:
logger.error(f"Unable to connect to host: {str(err)}")
raise RemoteFolderException(
status=500, message=f"Error connecting to host {connection.host}: {err}"
status=500, message=f"Error connecting to host {connection.host}"
)

return remote_handler
@@ -251,7 +257,9 @@ def sftp_walk(sftp, remote_path):


@remote_exception_handler
def sync_test_folders(remote_connection: RemoteConnection, remote_folder: RemoteFolder):
def sync_test_folders(
remote_connection: RemoteConnection, remote_folder: RemoteFolder, path_prefix: str
):
"""
Synchronize remote test folders to local storage
Remote folders will be synchronized to REPORT_DATA_DIR
@@ -263,14 +271,21 @@ def sync_test_folders(remote_connection: RemoteConnection, remote_folder: Remote
with client.open_sftp() as sftp:
report_folder = Path(remote_folder.remotePath).name
destination_dir = Path(
REPORT_DATA_DIRECTORY, remote_connection.name, report_folder
REPORT_DATA_DIRECTORY,
path_prefix,
remote_connection.host,
report_folder,
)
if not Path(destination_dir).exists():
Path(destination_dir).mkdir(parents=True, exist_ok=True)
for directory, files in sftp_walk(sftp, remote_folder.remotePath):
sftp.chdir(str(directory))
for file in files:
sftp.get(file, str(Path(destination_dir, file)))
output_file = Path(destination_dir, file)
logger.info(
f"Writing file: {str(output_file.parent.name)}/{str(output_file.name)}"
)
sftp.get(file, str(output_file))


@remote_exception_handler
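The `remotes.py` hunks above tighten the decorator that converts SSH-layer exceptions into API errors with explicit status codes, logging the raw error server-side while returning a clean message to the client. A minimal sketch of the pattern, using stand-in exception classes rather than paramiko's real ones:

```python
import functools
import logging

logger = logging.getLogger(__name__)

# Stand-ins for the real exceptions (e.g. paramiko's AuthenticationException
# and ttnn_visualizer's RemoteFolderException); illustrative only.
class AuthenticationException(Exception): ...

class RemoteFolderException(Exception):
    def __init__(self, status, message):
        super().__init__(message)
        self.status = status
        self.message = message

def remote_exception_handler(func):
    @functools.wraps(func)
    def remote_handler(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except AuthenticationException as err:
            # Keep the raw error in the server log; surface a 403 to the API.
            logger.error(f"Unable to authenticate: {err}")
            raise RemoteFolderException(status=403, message="Unable to authenticate")
    return remote_handler

@remote_exception_handler
def list_remote_folders():
    raise AuthenticationException("bad key")
```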
7 changes: 4 additions & 3 deletions backend/ttnn_visualizer/serializers.py
@@ -19,9 +19,10 @@ def serialize_operations(

device_operations_dict = dict()
for device_operation in device_operations:
device_operations_dict.update(
{device_operation.operation_id: device_operation.captured_graph}
)
if hasattr(device_operation, "operation_id"):
device_operations_dict.update(
{device_operation.operation_id: device_operation.captured_graph}
)

stack_traces_dict = defaultdict(str)
for stack_trace in stack_traces:
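The `serializers.py` change above skips device operations that lack an `operation_id` attribute instead of letting a malformed record raise `AttributeError`. A minimal sketch of the guard, with stand-in records rather than the real device operation objects:

```python
from types import SimpleNamespace

def build_device_operations_dict(device_operations):
    # Only index operations that actually carry an operation_id;
    # records without one are skipped rather than raising AttributeError.
    device_operations_dict = dict()
    for device_operation in device_operations:
        if hasattr(device_operation, "operation_id"):
            device_operations_dict[device_operation.operation_id] = (
                device_operation.captured_graph
            )
    return device_operations_dict

ops = [
    SimpleNamespace(operation_id=1, captured_graph=["node_a"]),
    SimpleNamespace(error="malformed record"),  # no operation_id attribute
]
print(build_device_operations_dict(ops))  # {1: ['node_a']}
```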