From 632c21216e85e119dc7f47b0118ff25535f0a7be Mon Sep 17 00:00:00 2001 From: Yiheng Wang Date: Fri, 28 Feb 2025 22:53:14 +0800 Subject: [PATCH 01/11] add fast inference tutorial Signed-off-by: Yiheng Wang --- acceleration/README.md | 2 + .../fast_inference_tutorial.ipynb | 336 ++++++++++++++++++ acceleration/fast_inference_tutorial/utils.py | 194 ++++++++++ 3 files changed, 532 insertions(+) create mode 100644 acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb create mode 100644 acceleration/fast_inference_tutorial/utils.py diff --git a/acceleration/README.md b/acceleration/README.md index e803b6e445..8c6cf65f89 100644 --- a/acceleration/README.md +++ b/acceleration/README.md @@ -4,6 +4,8 @@ Typically, model training is a time-consuming step during deep learning developm ### List of notebooks and examples #### [fast_model_training_guide](./fast_model_training_guide.md) The document introduces details of how to profile the training pipeline, how to analyze the dataset and select suitable algorithms, and how to optimize GPU utilization in single GPU, multi-GPUs or even multi-nodes. +#### [fast_inference_tutorial](./fast_inference_tutorial) +The example introduces details of how to use GDS, GPU transforms and TensorRT to accelerate the inference. #### [distributed_training](./distributed_training) The examples show how to execute distributed training and evaluation based on 3 different frameworks: - PyTorch native `DistributedDataParallel` module with `torchrun`. 
diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb new file mode 100644 index 0000000000..253fb9e40a --- /dev/null +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -0,0 +1,336 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Copyright (c) MONAI Consortium \n", + "Licensed under the Apache License, Version 2.0 (the \"License\"); \n", + "you may not use this file except in compliance with the License. \n", + "You may obtain a copy of the License at \n", + "    http://www.apache.org/licenses/LICENSE-2.0 \n", + "Unless required by applicable law or agreed to in writing, software \n", + "distributed under the License is distributed on an \"AS IS\" BASIS, \n", + "WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. \n", + "See the License for the specific language governing permissions and \n", + "limitations under the License.\n", + "\n", + "# Fast Inference with MONAI features" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This tutorial demonstrates the performance comparison between a standard PyTorch training program and a MONAI-optimized inference program. The key features include:\n", + "\n", + "1. **Direct Data Loading**: Load data directly from disk to GPU memory, minimizing data transfer time and improving efficiency.\n", + "2. **GPU-based Preprocessing**: Execute preprocessing transforms directly on the GPU, leveraging its computational power for faster data preparation.\n", + "3. **TensorRT Inference**: Utilize TensorRT for running inference, which optimizes the model for high-performance execution on NVIDIA GPUs.\n", + "\n", + "This tutorial is modified from the `TensorRT_inference_acceleration` tutorial." 
+ ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup environment" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Loading data directly from disk to GPU memory requires the `kvikio` library. In addition, this tutorial requires many other dependencies such as `monai`, `torch`, `torch_tensorrt`, `numpy`, `ignite`, `pandas`, `matplotlib`, etc. We recommend using the [MONAI Docker](https://docs.monai.io/en/latest/installation.html#from-dockerhub) image to run this tutorial, which includes pre-configured dependencies and allows you to skip manual installation.\n", + "\n", + "If not using MONAI Docker, install `kvikio` using one of these methods:\n", + "\n", + "- **PyPI Installation** \n", + " Use the appropriate package for your CUDA version:\n", + " ```bash\n", + " pip install kvikio-cu12 # For CUDA 12\n", + " pip install kvikio-cu11 # For CUDA 11\n", + " ```\n", + "\n", + "- **Conda/Mamba Installation** \n", + " Follow the official [KvikIO installation guide](https://docs.rapids.ai/api/kvikio/nightly/install/) for Conda/Mamba installations.\n", + "\n", + "For convenience, we provide the cell below to install all the dependencies (please modify the cell based on your actual CUDA version, and please note that only CUDA 11 and CUDA 12 are supported for now)." 
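Editor's note: the CUDA-version-to-package mapping described above (only CUDA 11 and CUDA 12 are supported for now) can be captured in a small helper. This is a sketch, not part of the tutorial: the `kvikio_package` name is ours, and in practice you might feed it the string from `torch.version.cuda`.

```python
def kvikio_package(cuda_version: str) -> str:
    """Return the kvikio PyPI package name matching a CUDA version string, e.g. "12.4"."""
    major = cuda_version.split(".")[0]
    if major not in ("11", "12"):
        # mirrors the tutorial's note: only CUDA 11 and CUDA 12 are supported for now
        raise ValueError(f"only CUDA 11 and CUDA 12 are supported, got {cuda_version}")
    return f"kvikio-cu{major}"


print(kvikio_package("12.4"))  # kvikio-cu12
```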
+ ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ + "!python -c \"import monai\" || pip install -q \"monai-weekly[nibabel, pydicom, tqdm]\"\n", + "!python -c \"import matplotlib\" || pip install -q matplotlib\n", + "!python -c \"import torch_tensorrt\" || pip install torch_tensorrt\n", + "!python -c \"import kvikio\" || pip install kvikio-cu12\n", + "!python -c \"import ignite\" || pip install pytorch-ignite\n", + "!python -c \"import pandas\" || pip install pandas\n", + "!python -c \"import requests\" || pip install requests\n", + "!python -c \"import fire\" || pip install fire\n", + "!python -c \"import onnx\" || pip install onnx\n", + "%matplotlib inline" + ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "## Setup imports" + ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ + "import os\n", + "\n", + "import torch\n", + "import torch_tensorrt\n", + "import matplotlib.pyplot as plt\n", + "import monai\n", + "from monai.config import print_config\n", + "from monai.transforms import (\n", + " EnsureChannelFirstd,\n", + " EnsureTyped,\n", + " LoadImaged,\n", + " Orientationd,\n", + " Spacingd,\n", + " ScaleIntensityRanged,\n", + " Compose\n", + ")\n", + "from monai.data import Dataset,ThreadDataLoader\n", + "import torch\n", + "import numpy as np\n", + "import copy\n", + "\n", + "print(f\"Torch-TensorRT version: {torch_tensorrt.__version__}.\")\n", + "\n", + "print_config()" + ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "## Prepare Test Data, Bundle, and TensorRT Model\n", + "\n", + "We provide a helper script, [`prepare_data.py`](./prepare_data.py), to simplify the setup process.
This script performs the following tasks:\n", + "\n", + "- **Test Data**: Downloads and extracts the [Medical Segmentation Decathlon Task09 Spleen dataset](http://medicaldecathlon.com/).\n", + "- **Bundle**: Downloads the required `spleen_ct_segmentation` bundle.\n", + "- **TensorRT Model**: Exports the downloaded bundle model to a TensorRT engine-based TorchScript model. By default, the script exports the model using `fp16` precision, but you can modify it to use `fp32` precision if desired.\n", + "\n", + "The script automatically checks for existing data, bundles, and exported models before downloading or exporting. This ensures that repeated executions of the notebook do not result in redundant operations." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from utils import prepare_test_datalist, prepare_test_bundle, prepare_tensorrt_model\n", + "\n", + "root_dir = \".\"\n", + "\n", + "train_files = prepare_test_datalist(root_dir)\n", + "bundle_path = prepare_test_bundle(bundle_dir=root_dir, bundle_name=\"spleen_ct_segmentation\")\n", + "trt_model_name = \"model_trt.ts\"\n", + "prepare_tensorrt_model(bundle_path, trt_model_name)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Benchmark the end-to-end bundle inference\n", + "\n", + "A variable `benchmark_type` is defined to specify the type of benchmark to run. To have a fair comparison, each benchmark type should be run after restarting the notebook kernel.\n", + "\n", + "`benchmark_type` can be one of the following:\n", + "\n", + "- `\"original\"`: benchmark the original bundle inference.\n", + "- `\"trt\"`: benchmark the TensorRT accelerated bundle inference.\n", + "- `\"trt_gds\"`: benchmark the TensorRT accelerated bundle inference with GPU data loading and GPU transforms." 
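Editor's note: since each benchmark type is run in a separate kernel session and `benchmark_workflow` saves its results to `benchmark_<type>.csv`, the runs can be compared afterwards. A minimal sketch — the `compare_benchmarks` helper is ours, it assumes the CSV layout produced by `TimerHandler` (a `RUN` row and a `total/ms` column) and treats the first entry as the baseline:

```python
import pandas as pd


def compare_benchmarks(csv_paths: dict) -> pd.DataFrame:
    """Collect the total run latency from each benchmark CSV and compute speedup vs the first entry."""
    totals = {name: pd.read_csv(path, index_col=0).loc["RUN", "total/ms"] for name, path in csv_paths.items()}
    baseline = next(iter(totals.values()))  # first entry is treated as the baseline
    return pd.DataFrame({"total/ms": totals, "speedup": {k: baseline / v for k, v in totals.items()}})


# e.g. compare_benchmarks({"original": "benchmark_original.csv", "trt": "benchmark_trt.csv"})
```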
+ ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ + "benchmark_type = \"trt_gds\"" + ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "A `TimerHandler` is defined to benchmark every part of the inference process.\n", + "\n", + "Please refer to `utils.py` for the implementation of `CUDATimer` and `TimerHandler`." + ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ + "from utils import TimerHandler, prepare_workflow, benchmark_workflow" + ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "### Benchmark the Original Bundle Inference\n", + "\n", + "In this section, the `workflow` runs several iterations to benchmark the latency." + ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ + "model_weight = os.path.join(bundle_path, \"models\", \"model.pt\")\n", + "meta_config = os.path.join(bundle_path, \"configs\", \"metadata.json\")\n", + "inference_config = os.path.join(bundle_path, \"configs\", \"inference.json\")\n", + "\n", + "override = {\n", + " \"dataset#data\": [{\"image\": i} for i in train_files],\n", + " \"output_postfix\": benchmark_type,\n", + "}" + ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": [ + "if benchmark_type == \"original\":\n", + "\n", + " workflow = prepare_workflow(inference_config, meta_config, bundle_path, override)\n", + " torch_timer = TimerHandler()\n", + " benchmark_df = benchmark_workflow(workflow, torch_timer, benchmark_type)" + ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "### Benchmark the TensorRT Accelerated Bundle Inference\n", + "In this part, the TensorRT accelerated model is loaded to the `workflow`. The updated `workflow` runs the same iterations as before to benchmark the latency difference.
Since the TensorRT accelerated model cannot be loaded through the `CheckpointLoader` and does not support `amp` mode, the `CheckpointLoader` is disabled in the `initialize` step of the `workflow`, and the `amp` parameter in the `evaluator` of the `workflow` is set to `False`." + ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ + "if benchmark_type == \"trt\":\n", + " trt_model_path = os.path.join(bundle_path, \"models\", \"model_trt.ts\")\n", + " trt_model = torch.jit.load(trt_model_path)\n", + "\n", + " override[\"load_pretrain\"] = False\n", + " override[\"network_def\"] = trt_model\n", + " override[\"evaluator#amp\"] = False\n", + "\n", + " workflow = prepare_workflow(inference_config, meta_config, bundle_path, override)\n", + " trt_timer = TimerHandler()\n", + " benchmark_df = benchmark_workflow(workflow, trt_timer, benchmark_type)" + ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "### Benchmarking TensorRT Accelerated Bundle Inference with GPU Data Loading and GPU-based Transforms\n", + "\n", + "In the previous section, the inference workflow utilized CPU-based transforms. In this section, we enhance performance by leveraging GPU acceleration:\n", + "\n", + "- **GPU Direct Storage (GDS)**: The `LoadImaged` transform enables GDS on `.nii` and `.dcm` files by specifying `to_gpu=True`.\n", + "- **GPU-based Transforms**: After GDS, subsequent preprocessing transforms are executed directly on the GPU."
+ ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [], + "source": [ + "transforms = Compose([\n", + " LoadImaged(keys=\"image\", reader=\"NibabelReader\", to_gpu=False),\n", + " EnsureTyped(keys=\"image\", device=torch.device(\"cuda:0\")),\n", + " EnsureChannelFirstd(keys=\"image\"),\n", + " Orientationd(keys=\"image\", axcodes=\"RAS\"),\n", + " Spacingd(keys=\"image\", pixdim=[1.5, 1.5, 2.0], mode=\"bilinear\"),\n", + " ScaleIntensityRanged(keys=\"image\", a_min=-57, a_max=164, b_min=0, b_max=1, clip=True),\n", + "])\n", + "\n", + "dataset = Dataset(data=[{\"image\": i} for i in train_files], transform=transforms)\n", + "dataloader = ThreadDataLoader(dataset, batch_size=1, shuffle=False, num_workers=0)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "if benchmark_type == \"trt_gds\":\n", + "\n", + " trt_model_path = os.path.join(bundle_path, \"models\", \"model_trt.ts\")\n", + " trt_model = torch.jit.load(trt_model_path)\n", + " override = {\n", + " \"output_postfix\": benchmark_type,\n", + " \"load_pretrain\": False,\n", + " \"network_def\": trt_model,\n", + " \"evaluator#amp\": False,\n", + " \"preprocessing\": transforms,\n", + " \"dataset\": dataset,\n", + " \"dataloader\": dataloader,\n", + " }\n", + "\n", + " workflow = prepare_workflow(inference_config, meta_config, bundle_path, override)\n", + " trt_gpu_trans_timer = TimerHandler()\n", + " benchmark_df = benchmark_workflow(workflow, trt_gpu_trans_timer, benchmark_type)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "kvikio_env", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.14" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff 
--git a/acceleration/fast_inference_tutorial/utils.py b/acceleration/fast_inference_tutorial/utils.py new file mode 100644 index 0000000000..372f05a906 --- /dev/null +++ b/acceleration/fast_inference_tutorial/utils.py @@ -0,0 +1,194 @@ +# Copyright (c) MONAI Consortium +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# http://www.apache.org/licenses/LICENSE-2.0 +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + + +import os +import glob +import shutil +import monai +import pandas as pd +import numpy as np +from collections import OrderedDict +import torch +from ignite.engine import Engine +from ignite.engine import Events +from monai.engines import IterationEvents +from monai.bundle import trt_export +from monai.apps import download_and_extract + + +def prepare_test_datalist(root_dir): + resource = "https://msd-for-monai.s3-us-west-2.amazonaws.com/Task09_Spleen.tar" + md5 = "410d4a301da4e5b2f6f86ec3ddba524e" + + compressed_file = os.path.join(root_dir, "Task09_Spleen.tar") + data_root = os.path.join(root_dir, "Task09_Spleen") + if not os.path.exists(data_root): + download_and_extract(resource, compressed_file, root_dir, md5) + + nii_dir = os.path.join(data_root, "imagesTr_nii") + if not os.path.exists(nii_dir): + os.makedirs(nii_dir, exist_ok=True) + train_gz_files = sorted(glob.glob(os.path.join(data_root, "imagesTr", "*.nii.gz"))) + for file in train_gz_files: + new_file = file.replace(".nii.gz", ".nii") + if not os.path.exists(new_file): + os.system(f"gzip -dc {file} > {new_file}") + shutil.copy(new_file, nii_dir) + else: + print(f"Test data already exists at 
{nii_dir}") + + train_files = sorted(glob.glob(os.path.join(nii_dir, "*.nii"))) + return train_files + + +def prepare_test_bundle(bundle_dir, bundle_name="spleen_ct_segmentation"): + bundle_path = os.path.join(bundle_dir, bundle_name) + if not os.path.exists(bundle_path): + monai.bundle.download(name=bundle_name, bundle_dir=bundle_dir) + else: + print(f"Bundle already exists at {bundle_path}") + return bundle_path + + +def prepare_tensorrt_model(bundle_path, trt_model_name="model_trt.ts"): + output_path = os.path.join(bundle_path, "models", trt_model_name) + if not os.path.exists(output_path): + trt_export( + net_id="network_def", + filepath=output_path, + ckpt_file=os.path.join(bundle_path, "models", "model.pt"), + meta_file=os.path.join(bundle_path, "configs", "metadata.json"), + config_file=os.path.join(bundle_path, "configs", "inference.json"), + precision="fp16", + dynamic_batchsize=[1, 4, 8], + use_onnx=True, + use_trace=True + ) + else: + print(f"TensorRT model already exists at {output_path}") + + +class CUDATimer: + def __init__(self, type_str) -> None: + self.time_list = [] + self.type_str = type_str + + def start(self) -> None: + self.starter = torch.cuda.Event(enable_timing=True) + self.ender = torch.cuda.Event(enable_timing=True) + torch.cuda.synchronize() + self.starter.record() + + def end(self) -> None: + self.ender.record() + torch.cuda.synchronize() + self.time_list.append(self.starter.elapsed_time(self.ender)) + + def get_max(self) -> float: + return max(self.time_list) + + def get_min(self) -> float: + return min(self.time_list) + + def get_mean(self) -> float: + np_time = np.array(self.time_list) + return np.mean(np_time) + + def get_std(self) -> float: + np_time = np.array(self.time_list) + return np.std(np_time) + + def get_sum(self) -> float: + np_time = np.array(self.time_list) + return np.sum(np_time) + + def get_results_dict(self) -> OrderedDict: + out_list = [ + ("total", self.get_sum()), + ("min", self.get_min()), + ("max", 
self.get_max()), + ("mean", self.get_mean()), + ("std", self.get_std()), + ] + return OrderedDict(out_list) + + +class TimerHandler: + def __init__(self) -> None: + self.run_timer = CUDATimer("RUN") + self.epoch_timer = CUDATimer("EPOCH") + self.iteration_timer = CUDATimer("ITERATION") + self.net_forward_timer = CUDATimer("FORWARD") + self.get_batch_timer = CUDATimer("PREPARE_BATCH") + self.post_process_timer = CUDATimer("POST_PROCESS") + self.timer_list = [ + self.run_timer, + self.epoch_timer, + self.iteration_timer, + self.net_forward_timer, + self.get_batch_timer, + self.post_process_timer, + ] + + def attach(self, engine: Engine) -> None: + engine.add_event_handler(Events.STARTED, self.started, timer=self.run_timer) + engine.add_event_handler(Events.EPOCH_STARTED, self.started, timer=self.epoch_timer) + engine.add_event_handler(Events.ITERATION_STARTED, self.started, timer=self.iteration_timer) + engine.add_event_handler(Events.GET_BATCH_STARTED, self.started, timer=self.get_batch_timer) + engine.add_event_handler(Events.GET_BATCH_COMPLETED, self.completed, timer=self.get_batch_timer) + engine.add_event_handler(Events.GET_BATCH_COMPLETED, self.started, timer=self.net_forward_timer) + engine.add_event_handler(IterationEvents.FORWARD_COMPLETED, self.completed, timer=self.net_forward_timer) + engine.add_event_handler(IterationEvents.FORWARD_COMPLETED, self.started, timer=self.post_process_timer) + engine.add_event_handler(Events.ITERATION_COMPLETED, self.completed, timer=self.post_process_timer) + engine.add_event_handler(Events.ITERATION_COMPLETED, self.completed, timer=self.iteration_timer) + engine.add_event_handler(Events.EPOCH_COMPLETED, self.completed, timer=self.epoch_timer) + engine.add_event_handler(Events.COMPLETED, self.completed, timer=self.run_timer) + + def started(self, engine: Engine, timer: CUDATimer) -> None: + timer.start() + + def completed(self, engine: Engine, timer: CUDATimer) -> None: + timer.end() + + def print_results(self) -> None: + 
index = [x.type_str for x in self.timer_list] + column_title = list(self.timer_list[0].get_results_dict().keys()) + column_title = [x + "/ms" for x in column_title] + latency_list = [x for timer in self.timer_list for x in timer.get_results_dict().values()] + latency_array = np.array(latency_list) + latency_array = np.reshape(latency_array, (len(index), len(column_title))) + df = pd.DataFrame(latency_array, index=index, columns=column_title) + return df + + +def prepare_workflow(inference_config, meta_config, bundle_path, override): + workflow = monai.bundle.ConfigWorkflow( + workflow="infer", + config_file=inference_config, + meta_file=meta_config, + logging_file=os.path.join(bundle_path, "configs", "logging.conf"), + bundle_root=bundle_path, + **override, + ) + + return workflow + +def benchmark_workflow(workflow, timer, benchmark_type): + workflow.initialize() + timer.attach(workflow.evaluator) + workflow.run() + workflow.finalize() + + benchmark_df = timer.print_results() + benchmark_df.to_csv(f"benchmark_{benchmark_type}.csv") + + return benchmark_df From d2873ec10446f1bb2dbb91bd1157f4abc46df008 Mon Sep 17 00:00:00 2001 From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com> Date: Fri, 28 Feb 2025 14:54:43 +0000 Subject: [PATCH 02/11] [pre-commit.ci] auto fixes from pre-commit.com hooks for more information, see https://pre-commit.ci --- .../fast_inference_tutorial.ipynb | 22 ++++++++++--------- acceleration/fast_inference_tutorial/utils.py | 3 ++- 2 files changed, 14 insertions(+), 11 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 253fb9e40a..4faafb1538 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -105,9 +105,9 @@ " Orientationd,\n", " Spacingd,\n", " ScaleIntensityRanged,\n", - " Compose\n", + " 
Compose,\n", ")\n", - "from monai.data import Dataset,ThreadDataLoader\n", + "from monai.data import Dataset, ThreadDataLoader\n", "import torch\n", "import numpy as np\n", "import copy\n", @@ -273,14 +273,16 @@ "metadata": {}, "outputs": [], "source": [ - "transforms = Compose([\n", - " LoadImaged(keys=\"image\", reader=\"NibabelReader\", to_gpu=False),\n", - " EnsureTyped(keys=\"image\", device=torch.device(\"cuda:0\")),\n", - " EnsureChannelFirstd(keys=\"image\"),\n", - " Orientationd(keys=\"image\", axcodes=\"RAS\"),\n", - " Spacingd(keys=\"image\", pixdim=[1.5, 1.5, 2.0], mode=\"bilinear\"),\n", - " ScaleIntensityRanged(keys=\"image\", a_min=-57, a_max=164, b_min=0, b_max=1, clip=True),\n", - "])\n", + "transforms = Compose(\n", + " [\n", + " LoadImaged(keys=\"image\", reader=\"NibabelReader\", to_gpu=False),\n", + " EnsureTyped(keys=\"image\", device=torch.device(\"cuda:0\")),\n", + " EnsureChannelFirstd(keys=\"image\"),\n", + " Orientationd(keys=\"image\", axcodes=\"RAS\"),\n", + " Spacingd(keys=\"image\", pixdim=[1.5, 1.5, 2.0], mode=\"bilinear\"),\n", + " ScaleIntensityRanged(keys=\"image\", a_min=-57, a_max=164, b_min=0, b_max=1, clip=True),\n", + " ]\n", + ")\n", "\n", "dataset = Dataset(data=[{\"image\": i} for i in train_files], transform=transforms)\n", "dataloader = ThreadDataLoader(dataset, batch_size=1, shuffle=False, num_workers=0)" diff --git a/acceleration/fast_inference_tutorial/utils.py b/acceleration/fast_inference_tutorial/utils.py index 372f05a906..1d45e84493 100644 --- a/acceleration/fast_inference_tutorial/utils.py +++ b/acceleration/fast_inference_tutorial/utils.py @@ -71,7 +71,7 @@ def prepare_tensorrt_model(bundle_path, trt_model_name="model_trt.ts"): precision="fp16", dynamic_batchsize=[1, 4, 8], use_onnx=True, - use_trace=True + use_trace=True, ) else: print(f"TensorRT model already exists at {output_path}") @@ -182,6 +182,7 @@ def prepare_workflow(inference_config, meta_config, bundle_path, override): return workflow + def 
benchmark_workflow(workflow, timer, benchmark_type): workflow.initialize() timer.attach(workflow.evaluator) From 9f229ea3afeaf8db50ecc537969db46f1700a358 Mon Sep 17 00:00:00 2001 From: Yiheng Wang Date: Fri, 7 Mar 2025 12:51:42 +0800 Subject: [PATCH 03/11] rewrite with liver and whole body ct seg Signed-off-by: Yiheng Wang --- .../fast_inference_tutorial.ipynb | 339 ++++++++++++------ acceleration/fast_inference_tutorial/utils.py | 208 +++-------- 2 files changed, 285 insertions(+), 262 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 253fb9e40a..ec81c104b7 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -15,20 +15,22 @@ "See the License for the specific language governing permissions and \n", "limitations under the License.\n", "\n", - "# Fast Inference with MONAI features" + "# Fast Inference with MONAI Features" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This tutorial demonstrates the performance comparison between a standard PyTorch training program and a MONAI-optimized inference program. The key features include:\n", + "## Accelerating Model Inference with MONAI\n", "\n", - "1. **Direct Data Loading**: Load data directly from disk to GPU memory, minimizing data transfer time and improving efficiency.\n", - "2. **GPU-based Preprocessing**: Execute preprocessing transforms directly on the GPU, leveraging its computational power for faster data preparation.\n", - "3. **TensorRT Inference**: Utilize TensorRT for running inference, which optimizes the model for high-performance execution on NVIDIA GPUs.\n", + "In this tutorial, we explore three powerful features that can accelerate model inference using MONAI. 
These features are designed to optimize the data handling and computational efficiency of your inference pipeline, particularly when working with NVIDIA GPUs. The tutorial will guide you through the following features and provide a comprehensive benchmarking strategy to evaluate the performance improvements offered by each feature:\n", "\n", - "This tutorial is modified from the `TensorRT_inference_acceleration` tutorial." + "1. **TensorRT Inference**: Utilize NVIDIA's TensorRT to optimize and execute models for high-performance inference on NVIDIA GPUs.\n", + "\n", + "2. **GPU-Based Preprocessing**: Leverage the computational power of GPUs to perform data preprocessing directly on the GPU. This can significantly reduce the time spent on data preparation, enabling faster inference.\n", + "\n", + "3. **Direct GPU Data Loading**: Minimize data transfer times by loading data directly from disk into GPU memory. This feature supports NIfTI and DICOM file formats." ] }, { @@ -36,7 +38,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Setup environment" + "## Install environment" ] }, { @@ -75,6 +77,7 @@ "!python -c \"import requests\" || pip install requests\n", "!python -c \"import fire\" || pip install fire\n", "!python -c \"import onnx\" || pip install onnx\n", + "!python -c \"import nvtx\" || pip install nvtx\n", "%matplotlib inline" ] }, @@ -95,8 +98,6 @@ "\n", "import torch\n", "import torch_tensorrt\n", - "import matplotlib.pyplot as plt\n", - "import monai\n", "from monai.config import print_config\n", "from monai.transforms import (\n", " EnsureChannelFirstd,\n", @@ -104,14 +105,20 @@ " LoadImaged,\n", " Orientationd,\n", " Spacingd,\n", - " ScaleIntensityRanged,\n", + " NormalizeIntensityd,\n", + " ScaleIntensityd,\n", + " Invertd,\n", + " Activationsd,\n", + " AsDiscreted,\n", " Compose\n", ")\n", - "from monai.data import Dataset,ThreadDataLoader\n", + "from monai.inferers import sliding_window_inference\n", + "from monai.networks.nets 
import SegResNet\n", "import torch\n", - "import numpy as np\n", - "import copy\n", + "import pandas as pd\n", + "from timeit import default_timer as timer\n", "\n", + "os.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0\"\n", "print(f\"Torch-TensorRT version: {torch_tensorrt.__version__}.\")\n", "\n", "print_config()" @@ -121,194 +128,314 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Prepare Test Data, Bundle, and TensorRT Model\n", + "## Introduction to Fast Inference Features\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 1. TensorRT Inference\n", "\n", - "We provide a helper script, [`prepare_data.py`](./prepare_data.py), to simplify the setup process. This script performs the following tasks:\n", + "`monai.networks.utils.convert_to_trt` is a function that converts a PyTorch model to a TensorRT engine-based TorchScript model. Except for the loading method (you need to use `torch.jit.load` to load the model), the usage of the converted TorchScript model is the same as the original model.\n", + "\n", + "`monai.data.torchscript_utils.save_net_with_metadata` is a function that saves the converted TorchScript model and its metadata.\n", + "\n", + "Example:\n", + "\n", + "```python\n", + "\n", + "from monai.networks.nets import SegResNet\n", + "from monai.networks.utils import convert_to_trt\n", + "from monai.data.torchscript_utils import save_net_with_metadata\n", + "\n", + "model = SegResNet(\n", + " spatial_dims=3,\n", + " in_channels=1,\n", + " out_channels=105,\n", + " init_filters=32,\n", + " blocks_down=[1, 2, 2, 4],\n", + " blocks_up=[1, 1, 1],\n", + " dropout_prob=0.2,\n", + ")\n", + "weights = torch.load(\"model.pt\")\n", + "model.load_state_dict(weights)\n", + "torchscript_model = convert_to_trt(\n", + " model=model,\n", + " precision=\"fp16\",\n", + " input_shape=[1, 1, 96, 96, 96],\n", + " dynamic_batchsize=[1, 1, 1],\n", + " use_trace=False,\n", + " verify=True,\n", + ")\n", "\n", - "- **Test Data**: Downloads and 
extracts the [Medical Segmentation Decathlon Task09 Spleen dataset](http://medicaldecathlon.com/).\n", - "- **Bundle**: Downloads the required `spleen_ct_segmentation` bundle.\n", - "- **TensorRT Model**: Exports the downloaded bundle model to a TensorRT engine-based TorchScript model. By default, the script exports the model using `fp16` precision, but you can modify it to use `fp32` precision if desired.\n", + "save_net_with_metadata(torchscript_model, \"segresnet_trt\")\n", "\n", - "The script automatically checks for existing data, bundles, and exported models before downloading or exporting. This ensures that repeated executions of the notebook do not result in redundant operations." + "model = torch.jit.load(\"segresnet_trt.ts\")\n", + "```" ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "from utils import prepare_test_datalist, prepare_test_bundle, prepare_tensorrt_model\n", + "### 2. GPU-Based Preprocessing\n", "\n", - "root_dir = \".\"\n", + "`monai.transforms.EnsureTyped` transform allows you to specify the `device` and `dtype` for the output tensor. Therefore, in order to perform GPU-based preprocessing, you can insert the `EnsureTyped` transform at the beginning of your preprocessing transforms. 
For example:\n", "\n", - "train_files = prepare_test_datalist(root_dir)\n", - "bundle_path = prepare_test_bundle(bundle_dir=root_dir, bundle_name=\"spleen_ct_segmentation\")\n", - "trt_model_name = \"model_trt.ts\"\n", - "prepare_tensorrt_model(bundle_path, trt_model_name)" + "```python\n", + "preprocess_transforms = [\n", + " EnsureTyped(keys=\"image\", device=torch.device(\"cuda:0\"), track_meta=True),\n", + " Spacingd(keys=[\"image\"], pixdim=(1.5, 1.5, 2.0), mode=\"bilinear\"),\n", + " ScaleIntensityRanged(\n", + " keys=[\"image\"],\n", + " a_min=-57,\n", + " a_max=164,\n", + " b_min=0.0,\n", + " b_max=1.0,\n", + " clip=True,\n", + " ),\n", + "]\n", + "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Benchmark the end-to-end bundle inference\n", + "### 3. Direct GPU Data Loading\n", "\n", - "A variable `benchmark_type` is defined to specify the type of benchmark to run. To have a fair comparison, each benchmark type should be run after restarting the notebook kernel.\n", + "Starting with MONAI `1.4.1rc1`, `monai.data.PydicomReader` and `monai.data.NibabelReader` added the `to_gpu` argument to enable direct GPU data loading. To use this feature, you can set the `to_gpu` argument to `True` when initializing the `LoadImaged` transform. For example:\n", "\n", - "`benchmark_type` can be one of the following:\n", + "```python\n", + "loader = LoadImaged(keys=\"image\", reader=\"NibabelReader\", to_gpu=True)\n", + "```\n", "\n", - "- `\"original\"`: benchmark the original bundle inference.\n", - "- `\"trt\"`: benchmark the TensorRT accelerated bundle inference.\n", - "- `\"trt_gds\"`: benchmark the TensorRT accelerated bundle inference with GPU data loading and GPU transforms." 
+ "Please note that only NIfTI (.nii) and DICOM (.dcm) files are supported for direct GPU data loading; compressed \".nii.gz\" files are also supported, but the acceleration for them is not significant.\n" ] }, { - "cell_type": "code", - "execution_count": 4, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "benchmark_type = \"trt_gds\"" + "## Benchmarking Strategy\n", + "\n", + "In this section, we will benchmark the acceleration provided by each feature. Specifically, we will benchmark the following inference workflows:\n", + "\n", + "- Original inference workflow\n", + "- TensorRT inference workflow\n", + "- TensorRT inference workflow with GPU-based preprocessing\n", + "- TensorRT inference workflow with direct GPU data loading and GPU-based preprocessing\n", + "\n", + "For each benchmark type, `timeit.default_timer` is used to measure the time taken." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "A `TimerHandler` is defined to benchmark every part of the inference process.\n", + "### Define Benchmark Type\n", + "\n", + "A variable `benchmark_type` is used to specify the type of benchmark to run. To have a fair comparison, each benchmark type should be run after restarting the notebook kernel. `benchmark_type` can be one of the following:\n", "\n", - "Please refer to `utils.py` for the implementation of `CUDATimer` and `TimerHandler`." + "- `\"original\"`: benchmark the original model inference (with `amp` enabled).\n", + "- `\"trt\"`: benchmark the TensorRT accelerated model inference.\n", + "- `\"trt_gpu_transforms\"`: benchmark the TensorRT accelerated model inference with GPU transforms.\n", + "- `\"trt_gds_gpu_transforms\"`: benchmark the TensorRT accelerated model inference with GPU data loading and GPU transforms."
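As a side note on the file-format restriction above, the extension check can be made explicit with a small helper. This is an illustrative sketch only — the function name and the choice to opt out of GPU loading for `.nii.gz` (where the speedup is not significant) are assumptions, not part of MONAI:

```python
def use_gpu_loading(path: str) -> bool:
    """Hypothetical helper: decide whether to pass to_gpu=True for a file.

    Direct GPU loading targets NIfTI (.nii) and DICOM (.dcm) files; compressed
    .nii.gz files are also accepted by the reader, but the acceleration is not
    significant, so this sketch falls back to CPU loading for them.
    """
    if path.endswith(".nii.gz"):
        return False
    return path.endswith(".nii") or path.endswith(".dcm")
```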
] }, { "cell_type": "code", - "execution_count": 5, + "execution_count": 3, "metadata": {}, "outputs": [], "source": [ - "from utils import TimerHandler, prepare_workflow, benchmark_workflow" + "# please uncomment the expected benchmark type to run\n", + "\n", + "benchmark_type = \"original\"\n", + "# benchmark_type = \"trt\"\n", + "# benchmark_type = \"trt_gpu_transforms\"\n", + "# benchmark_type = \"trt_gds_gpu_transforms\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Benchmark the Original Bundle Inference\n", + "### Prepare Data and Model\n", + "\n", + "The [Medical Segmentation Decathlon Task03 Liver dataset](http://medicaldecathlon.com/) is used as an example to benchmark the acceleration performance. A helper script, [`utils.py`](./utils.py), is used to download and extract the dataset. The script also prepares the model weights and the TensorRT engine-based TorchScript model.\n", "\n", - "In this section, the `workflow`runs several iterations to benchmark the latency." + "The script automatically checks for existing data. This ensures that repeated executions of the notebook do not result in redundant operations."
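The skip-if-exists behaviour described above is just a guard around each expensive step (download, weight copy, TensorRT export). A minimal sketch of the pattern, with a hypothetical helper name not taken from the tutorial:

```python
import os


def prepare_once(output_path, make):
    """Run the (expensive) preparation callable only when its output is
    missing, so re-running the notebook does not repeat downloads or exports.

    `make` receives the target path and is expected to produce the artifact.
    """
    if not os.path.exists(output_path):
        make(output_path)
    else:
        print(f"Already exists at {output_path}, skipping preparation")
    return output_path
```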
] }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ - "model_weight = os.path.join(bundle_path, \"models\", \"model.pt\")\n", - "meta_config = os.path.join(bundle_path, \"configs\", \"metadata.json\")\n", - "inference_config = os.path.join(bundle_path, \"configs\", \"inference.json\")\n", - "\n", - "override = {\n", - " \"dataset#data\": [{\"image\": i} for i in train_files],\n", - " \"output_postfix\": benchmark_type,\n", - "}" + "from utils import prepare_test_datalist, prepare_model_weights, prepare_tensorrt_model\n", + "\n", + "root_dir = \".\"\n", + "device = torch.device(\"cuda:0\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", + "train_files = prepare_test_datalist(root_dir)\n", + "weights_path = prepare_model_weights(root_dir=root_dir, bundle_name=\"wholeBody_ct_segmentation\")\n", + "trt_model_name = \"model_trt.ts\"\n", + "trt_model_path = prepare_tensorrt_model(root_dir, weights_path, trt_model_name)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Define Inference Components" ] }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 5, "metadata": {}, "outputs": [], "source": [ - "if benchmark_type == \"original\":\n", + "def get_transforms(device, gpu_loading_flag=False, gpu_transforms_flag=False):\n", + " preprocess_transforms = [\n", + " LoadImaged(keys=\"image\", reader=\"NibabelReader\", to_gpu=gpu_loading_flag),\n", + " EnsureChannelFirstd(keys=\"image\"),\n", + " Orientationd(keys=[\"image\"], axcodes=\"RAS\"),\n", + " Spacingd(keys=[\"image\"], pixdim=(1.5, 1.5, 1.5), mode=\"bilinear\"),\n", + " NormalizeIntensityd(keys=\"image\", nonzero=True),\n", + " ScaleIntensityd(\n", + " keys=[\"image\"],\n", + " minv=-1.0,\n", + " maxv=1.0,\n", + " ),\n", + " ]\n", "\n", - " workflow = prepare_workflow(inference_config, meta_config, bundle_path, override)\n", - " torch_timer = TimerHandler()\n", - " benchmark_df = 
benchmark_workflow(workflow, torch_timer, benchmark_type)" + " if gpu_transforms_flag and not gpu_loading_flag:\n", + " preprocess_transforms.insert(1, EnsureTyped(keys=\"image\", device=device, track_meta=True))\n", + " infer_transforms = Compose(preprocess_transforms)\n", + "\n", + " return infer_transforms\n", + "\n", + "def get_post_transforms(infer_transforms):\n", + " post_transforms = Compose(\n", + " [\n", + " Activationsd(keys=\"pred\", softmax=True),\n", + " AsDiscreted(keys=\"pred\", argmax=True),\n", + " Invertd(\n", + " keys=\"pred\",\n", + " transform=infer_transforms,\n", + " orig_keys=\"image\",\n", + " nearest_interp=True,\n", + " to_tensor=True,\n", + " ),\n", + " ]\n", + " )\n", + " return post_transforms\n", + "\n", + "def get_model(device, weights_path, trt_model_path, trt_flag=False):\n", + " if not trt_flag:\n", + " model = SegResNet(\n", + " spatial_dims=3,\n", + " in_channels=1,\n", + " out_channels=105,\n", + " init_filters=32,\n", + " blocks_down=[1, 2, 2, 4],\n", + " blocks_up=[1, 1, 1],\n", + " dropout_prob=0.2,\n", + " )\n", + " weights = torch.load(weights_path)\n", + " model.load_state_dict(weights)\n", + " model.to(device)\n", + " model.eval()\n", + " else:\n", + " model = torch.jit.load(trt_model_path)\n", + " return model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Benchmark the TensorRT Accelerated Bundle Inference\n", - "In this part, the TensorRT accelerated model is loaded to the `workflow`. The updated `workflow` runs the same iterations as before to benchmark the latency difference. Since the TensorRT accelerated model cannot be loaded through the `CheckpointLoader` and don't have `amp` mode, disable the `CheckpointLoader` in the `initialize` of the `workflow` and the `amp` parameter in the `evaluator` of the `workflow` needs to be set to `False`." 
+ "### Define Inference Workflow\n", + "\n" ] }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 6, "metadata": {}, "outputs": [], "source": [ - "if benchmark_type == \"trt\":\n", - " trt_model_path = os.path.join(bundle_path, \"models\", \"model_trt.ts\")\n", - " trt_model = torch.jit.load(trt_model_path)\n", + "def run_inference(data_list, infer_transforms, post_transforms, model, device, benchmark_type):\n", + " total_time_dict = {}\n", + " roi_size = (96, 96, 96)\n", + " sw_batch_size = 1\n", + "\n", + " for idx, sample in enumerate(data_list[:5]):\n", + " start = timer()\n", + " data = infer_transforms({\"image\": sample})\n", "\n", - " override[\"load_pretrain\"] = False\n", - " override[\"network_def\"] = trt_model\n", - " override[\"evaluator#amp\"] = False\n", + " with torch.no_grad():\n", + " input_image = data[\"image\"].unsqueeze(0).to(device) if benchmark_type in [\"trt\", \"original\"] else data[\"image\"].unsqueeze(0)\n", + " if benchmark_type == \"original\":\n", + " with torch.autocast(device_type=\"cuda\"):\n", + " output_image = sliding_window_inference(input_image, roi_size, sw_batch_size, model)\n", + " else:\n", + " output_image = sliding_window_inference(input_image, roi_size, sw_batch_size, model)\n", + " \n", + " data[\"pred\"] = output_image.squeeze(0)\n", + " # data = post_transforms(data)\n", + " \n", + " end = timer()\n", "\n", - " workflow = prepare_workflow(inference_config, meta_config, bundle_path, override)\n", - " trt_timer = TimerHandler()\n", - " benchmark_df = benchmark_workflow(workflow, trt_timer, benchmark_type)" + " sample_name = sample.split(\"/\")[-1]\n", + " if idx > 0:\n", + " total_time_dict[sample_name] = end - start\n", + "\n", + " return total_time_dict" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Benchmarking TensorRT Accelerated Bundle Inference with GPU Data Loading and GPU-based Transforms\n", - "\n", - "In the previous section, the inference workflow utilized CPU-based 
transforms. In this section, we enhance performance by leveraging GPU acceleration:\n", - "\n", - "- **GPU Direct Storage (GDS)**: The `LoadImaged` transform enables GDS on `.nii` and `.dcm` files via specifying `to_gpu=True`.\n", - "- **GPU-based Transforms**: After GDS, subsequent preprocessing transforms are executed directly on the GPU." + "## Benchmark the end-to-end bundle inference" ] }, { "cell_type": "code", - "execution_count": 15, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ - "transforms = Compose([\n", - " LoadImaged(keys=\"image\", reader=\"NibabelReader\", to_gpu=False),\n", - " EnsureTyped(keys=\"image\", device=torch.device(\"cuda:0\")),\n", - " EnsureChannelFirstd(keys=\"image\"),\n", - " Orientationd(keys=\"image\", axcodes=\"RAS\"),\n", - " Spacingd(keys=\"image\", pixdim=[1.5, 1.5, 2.0], mode=\"bilinear\"),\n", - " ScaleIntensityRanged(keys=\"image\", a_min=-57, a_max=164, b_min=0, b_max=1, clip=True),\n", - "])\n", - "\n", - "dataset = Dataset(data=[{\"image\": i} for i in train_files], transform=transforms)\n", - "dataloader = ThreadDataLoader(dataset, batch_size=1, shuffle=False, num_workers=0)" + "gpu_transforms_flag = False\n", + "gpu_loading_flag = False\n", + "trt_flag = False\n", + "\n", + "if \"trt\" in benchmark_type:\n", + " trt_flag = True\n", + "if \"gpu_transforms\" in benchmark_type:\n", + " gpu_transforms_flag = True\n", + "if \"gds\" in benchmark_type:\n", + " gpu_loading_flag = True\n", + "\n", + "infer_transforms = get_transforms(device, gpu_loading_flag, gpu_transforms_flag)\n", + "post_transforms = get_post_transforms(infer_transforms)\n", + "model = get_model(device, weights_path, trt_model_path, trt_flag)\n", + "\n", + "total_time_dict = run_inference(train_files, infer_transforms, post_transforms, model, device, benchmark_type)" ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 8, "metadata": {}, "outputs": [], "source": [ - "if benchmark_type == \"trt_gds\":\n", - "\n", 
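The substring checks that derive the feature flags from `benchmark_type` in the cell above can equivalently be packaged as one small helper. This is a hypothetical refactoring shown only to make the mapping explicit:

```python
def flags_from_benchmark_type(benchmark_type: str):
    """Map a benchmark_type string to (trt_flag, gpu_transforms_flag,
    gpu_loading_flag), mirroring the `in` checks used in the notebook cell."""
    return (
        "trt" in benchmark_type,             # use the TensorRT TorchScript model
        "gpu_transforms" in benchmark_type,  # run preprocessing on the GPU
        "gds" in benchmark_type,             # load data directly to GPU memory
    )
```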
- " trt_model_path = os.path.join(bundle_path, \"models\", \"model_trt.ts\")\n", - " trt_model = torch.jit.load(trt_model_path)\n", - " override = {\n", - " \"output_postfix\": benchmark_type,\n", - " \"load_pretrain\": False,\n", - " \"network_def\": trt_model,\n", - " \"evaluator#amp\": False,\n", - " \"preprocessing\": transforms,\n", - " \"dataset\": dataset,\n", - " \"dataloader\": dataloader,\n", - " }\n", - "\n", - " workflow = prepare_workflow(inference_config, meta_config, bundle_path, override)\n", - " trt_gpu_trans_timer = TimerHandler()\n", - " benchmark_df = benchmark_workflow(workflow, trt_gpu_trans_timer, benchmark_type)" + "df = pd.DataFrame(list(total_time_dict.items()), columns=[\"file_name\", \"time\"])\n", + "df.to_csv(os.path.join(root_dir, f\"time_{benchmark_type}.csv\"), index=False)" ] } ], diff --git a/acceleration/fast_inference_tutorial/utils.py b/acceleration/fast_inference_tutorial/utils.py index 372f05a906..0e8eec95f4 100644 --- a/acceleration/fast_inference_tutorial/utils.py +++ b/acceleration/fast_inference_tutorial/utils.py @@ -10,34 +10,30 @@ # limitations under the License. 
-import os import glob +import os import shutil + import monai -import pandas as pd -import numpy as np -from collections import OrderedDict import torch -from ignite.engine import Engine -from ignite.engine import Events -from monai.engines import IterationEvents -from monai.bundle import trt_export from monai.apps import download_and_extract +from monai.data.torchscript_utils import save_net_with_metadata +from monai.networks.nets import SegResNet +from monai.networks.utils import convert_to_trt def prepare_test_datalist(root_dir): - resource = "https://msd-for-monai.s3-us-west-2.amazonaws.com/Task09_Spleen.tar" - md5 = "410d4a301da4e5b2f6f86ec3ddba524e" + resource = "https://msd-for-monai.s3-us-west-2.amazonaws.com/Task03_Liver.tar" - compressed_file = os.path.join(root_dir, "Task09_Spleen.tar") - data_root = os.path.join(root_dir, "Task09_Spleen") + compressed_file = os.path.join(root_dir, "Task03_Liver.tar") + data_root = os.path.join(root_dir, "Task03_Liver") if not os.path.exists(data_root): - download_and_extract(resource, compressed_file, root_dir, md5) + download_and_extract(resource, compressed_file, root_dir) - nii_dir = os.path.join(data_root, "imagesTr_nii") + nii_dir = os.path.join(data_root, "imagesTs_nii") if not os.path.exists(nii_dir): os.makedirs(nii_dir, exist_ok=True) - train_gz_files = sorted(glob.glob(os.path.join(data_root, "imagesTr", "*.nii.gz"))) + train_gz_files = sorted(glob.glob(os.path.join(data_root, "imagesTs", "*.nii.gz"))) for file in train_gz_files: new_file = file.replace(".nii.gz", ".nii") if not os.path.exists(new_file): @@ -46,149 +42,49 @@ def prepare_test_datalist(root_dir): else: print(f"Test data already exists at {nii_dir}") - train_files = sorted(glob.glob(os.path.join(nii_dir, "*.nii"))) - return train_files + files = sorted(glob.glob(os.path.join(nii_dir, "*.nii"))) + return files + +def prepare_model_weights(root_dir, bundle_name="spleen_ct_segmentation"): + bundle_path = os.path.join(root_dir, bundle_name) + 
weights_path = os.path.join(root_dir, "model.pt") + if not os.path.exists(weights_path): + monai.bundle.download(name=bundle_name, bundle_dir=root_dir) -def prepare_test_bundle(bundle_dir, bundle_name="spleen_ct_segmentation"): - bundle_path = os.path.join(bundle_dir, bundle_name) - if not os.path.exists(bundle_path): - monai.bundle.download(name=bundle_name, bundle_dir=bundle_dir) + weights_original_path = os.path.join(bundle_path, "models", "model.pt") + shutil.copy(weights_original_path, weights_path) else: - print(f"Bundle already exists at {bundle_path}") - return bundle_path - - -def prepare_tensorrt_model(bundle_path, trt_model_name="model_trt.ts"): - output_path = os.path.join(bundle_path, "models", trt_model_name) - if not os.path.exists(output_path): - trt_export( - net_id="network_def", - filepath=output_path, - ckpt_file=os.path.join(bundle_path, "models", "model.pt"), - meta_file=os.path.join(bundle_path, "configs", "metadata.json"), - config_file=os.path.join(bundle_path, "configs", "inference.json"), - precision="fp16", - dynamic_batchsize=[1, 4, 8], - use_onnx=True, - use_trace=True + print(f"Weights already exists at {weights_path}") + + return weights_path + + +def prepare_tensorrt_model(root_dir, weights_path, trt_model_name="model_trt.ts"): + trt_path = os.path.join(root_dir, trt_model_name) + if not os.path.exists(trt_path): + model = SegResNet( + spatial_dims=3, + in_channels=1, + out_channels=105, + init_filters=32, + blocks_down=[1, 2, 2, 4], + blocks_up=[1, 1, 1], + dropout_prob=0.2, ) + weights = torch.load(weights_path) + model.load_state_dict(weights) + torchscript_model = convert_to_trt( + model=model, + precision="fp32", + input_shape=[1, 1, 96, 96, 96], + dynamic_batchsize=[1, 1, 1], + use_trace=True, + verify=False, + ) + + save_net_with_metadata(torchscript_model, trt_model_name.split(".")[0]) else: - print(f"TensorRT model already exists at {output_path}") - - -class CUDATimer: - def __init__(self, type_str) -> None: - 
self.time_list = [] - self.type_str = type_str - - def start(self) -> None: - self.starter = torch.cuda.Event(enable_timing=True) - self.ender = torch.cuda.Event(enable_timing=True) - torch.cuda.synchronize() - self.starter.record() - - def end(self) -> None: - self.ender.record() - torch.cuda.synchronize() - self.time_list.append(self.starter.elapsed_time(self.ender)) - - def get_max(self) -> float: - return max(self.time_list) - - def get_min(self) -> float: - return min(self.time_list) - - def get_mean(self) -> float: - np_time = np.array(self.time_list) - return np.mean(np_time) - - def get_std(self) -> float: - np_time = np.array(self.time_list) - return np.std(np_time) - - def get_sum(self) -> float: - np_time = np.array(self.time_list) - return np.sum(np_time) - - def get_results_dict(self) -> OrderedDict: - out_list = [ - ("total", self.get_sum()), - ("min", self.get_min()), - ("max", self.get_max()), - ("mean", self.get_mean()), - ("std", self.get_std()), - ] - return OrderedDict(out_list) - - -class TimerHandler: - def __init__(self) -> None: - self.run_timer = CUDATimer("RUN") - self.epoch_timer = CUDATimer("EPOCH") - self.iteration_timer = CUDATimer("ITERATION") - self.net_forward_timer = CUDATimer("FORWARD") - self.get_batch_timer = CUDATimer("PREPARE_BATCH") - self.post_process_timer = CUDATimer("POST_PROCESS") - self.timer_list = [ - self.run_timer, - self.epoch_timer, - self.iteration_timer, - self.net_forward_timer, - self.get_batch_timer, - self.post_process_timer, - ] - - def attach(self, engine: Engine) -> None: - engine.add_event_handler(Events.STARTED, self.started, timer=self.run_timer) - engine.add_event_handler(Events.EPOCH_STARTED, self.started, timer=self.epoch_timer) - engine.add_event_handler(Events.ITERATION_STARTED, self.started, timer=self.iteration_timer) - engine.add_event_handler(Events.GET_BATCH_STARTED, self.started, timer=self.get_batch_timer) - engine.add_event_handler(Events.GET_BATCH_COMPLETED, self.completed, 
timer=self.get_batch_timer) - engine.add_event_handler(Events.GET_BATCH_COMPLETED, self.started, timer=self.net_forward_timer) - engine.add_event_handler(IterationEvents.FORWARD_COMPLETED, self.completed, timer=self.net_forward_timer) - engine.add_event_handler(IterationEvents.FORWARD_COMPLETED, self.started, timer=self.post_process_timer) - engine.add_event_handler(Events.ITERATION_COMPLETED, self.completed, timer=self.post_process_timer) - engine.add_event_handler(Events.ITERATION_COMPLETED, self.completed, timer=self.iteration_timer) - engine.add_event_handler(Events.EPOCH_COMPLETED, self.completed, timer=self.epoch_timer) - engine.add_event_handler(Events.COMPLETED, self.completed, timer=self.run_timer) - - def started(self, engine: Engine, timer: CUDATimer) -> None: - timer.start() - - def completed(self, engine: Engine, timer: CUDATimer) -> None: - timer.end() - - def print_results(self) -> None: - index = [x.type_str for x in self.timer_list] - column_title = list(self.timer_list[0].get_results_dict().keys()) - column_title = [x + "/ms" for x in column_title] - latency_list = [x for timer in self.timer_list for x in timer.get_results_dict().values()] - latency_array = np.array(latency_list) - latency_array = np.reshape(latency_array, (len(index), len(column_title))) - df = pd.DataFrame(latency_array, index=index, columns=column_title) - return df - - -def prepare_workflow(inference_config, meta_config, bundle_path, override): - workflow = monai.bundle.ConfigWorkflow( - workflow="infer", - config_file=inference_config, - meta_file=meta_config, - logging_file=os.path.join(bundle_path, "configs", "logging.conf"), - bundle_root=bundle_path, - **override, - ) - - return workflow - -def benchmark_workflow(workflow, timer, benchmark_type): - workflow.initialize() - timer.attach(workflow.evaluator) - workflow.run() - workflow.finalize() - - benchmark_df = timer.print_results() - benchmark_df.to_csv(f"benchmark_{benchmark_type}.csv") - - return benchmark_df + 
print(f"TensorRT model already exists at {trt_path}") + + return os.path.join(root_dir, trt_model_name) From 5bd3f675cd64884935cd8193fd6e0466f192ac73 Mon Sep 17 00:00:00 2001 From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com> Date: Fri, 7 Mar 2025 04:55:03 +0000 Subject: [PATCH 04/11] [pre-commit.ci] auto fixes from pre-commit.com hooks for more information, see https://pre-commit.ci --- .../fast_inference_tutorial.ipynb | 14 ++++++++++---- 1 file changed, 10 insertions(+), 4 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index e5affc4c58..5e75e707ab 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -109,7 +109,7 @@ " Invertd,\n", " Activationsd,\n", " AsDiscreted,\n", - " Compose\n", + " Compose,\n", ")\n", "from monai.inferers import sliding_window_inference\n", "from monai.networks.nets import SegResNet\n", @@ -316,6 +316,7 @@ "\n", " return infer_transforms\n", "\n", + "\n", "def get_post_transforms(infer_transforms):\n", " post_transforms = Compose(\n", " [\n", @@ -332,6 +333,7 @@ " )\n", " return post_transforms\n", "\n", + "\n", "def get_model(device, weights_path, trt_model_path, trt_flag=False):\n", " if not trt_flag:\n", " model = SegResNet(\n", @@ -376,16 +378,20 @@ " data = infer_transforms({\"image\": sample})\n", "\n", " with torch.no_grad():\n", - " input_image = data[\"image\"].unsqueeze(0).to(device) if benchmark_type in [\"trt\", \"original\"] else data[\"image\"].unsqueeze(0)\n", + " input_image = (\n", + " data[\"image\"].unsqueeze(0).to(device)\n", + " if benchmark_type in [\"trt\", \"original\"]\n", + " else data[\"image\"].unsqueeze(0)\n", + " )\n", " if benchmark_type == \"original\":\n", " with torch.autocast(device_type=\"cuda\"):\n", " output_image = sliding_window_inference(input_image, 
roi_size, sw_batch_size, model)\n", " else:\n", " output_image = sliding_window_inference(input_image, roi_size, sw_batch_size, model)\n", - " \n", + "\n", " data[\"pred\"] = output_image.squeeze(0)\n", " # data = post_transforms(data)\n", - " \n", + "\n", " end = timer()\n", "\n", " sample_name = sample.split(\"/\")[-1]\n", From 45b5da33ec00cbb81024ea9e382837bdd30080c6 Mon Sep 17 00:00:00 2001 From: Yiheng Wang Date: Fri, 7 Mar 2025 09:43:58 +0000 Subject: [PATCH 05/11] add scripts and update notebook Signed-off-by: Yiheng Wang --- .../fast_inference_tutorial.ipynb | 161 ++++++++++++------ .../fast_inference_tutorial/run_benchmark.py | 150 ++++++++++++++++ acceleration/fast_inference_tutorial/utils.py | 2 +- 3 files changed, 263 insertions(+), 50 deletions(-) create mode 100644 acceleration/fast_inference_tutorial/run_benchmark.py diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 5e75e707ab..23d18260b1 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -72,10 +72,8 @@ "!python -c \"import matplotlib\" || pip install -q matplotlib\n", "!python -c \"import torch_tensorrt\" || pip install torch_tensorrt\n", "!python -c \"import kvikio\" || pip install kvikio-cu12\n", - "!python -c \"import ignite\" || pip install pytorch-ignite\n", "!python -c \"import pandas\" || pip install pandas\n", "!python -c \"import requests\" || pip install requests\n", - "!python -c \"import fire\" || pip install fire\n", "!python -c \"import onnx\" || pip install onnx\n", "%matplotlib inline" ] @@ -106,19 +104,16 @@ " Spacingd,\n", " NormalizeIntensityd,\n", " ScaleIntensityd,\n", - " Invertd,\n", - " Activationsd,\n", - " AsDiscreted,\n", " Compose,\n", ")\n", "from monai.inferers import sliding_window_inference\n", "from monai.networks.nets import SegResNet\n", + "import 
matplotlib.pyplot as plt\n", "import torch\n", + "import gc\n", "import pandas as pd\n", "from timeit import default_timer as timer\n", "\n", - "print(f\"Torch-TensorRT version: {torch_tensorrt.__version__}.\")\n", - "\n", "print_config()" ] }, @@ -163,8 +158,8 @@ " precision=\"fp16\",\n", " input_shape=[1, 1, 96, 96, 96],\n", " dynamic_batchsize=[1, 1, 1],\n", - " use_trace=False,\n", - " verify=True,\n", + " use_trace=True,\n", + " verify=False,\n", ")\n", "\n", "save_net_with_metadata(torchscript_model, \"segresnet_trt\")\n", @@ -236,15 +231,15 @@ "\n", "A variable `benchmark_type` is used to specify the type of benchmark to run. To have a fair comparison, each benchmark type should be run after restarting the notebook kernel. `benchmark_type` can be one of the following:\n", "\n", - "- `\"original\"`: benchmark the original model inference (with `amp` enabled).\n", + "- `\"original\"`: benchmark the original model inference.\n", "- `\"trt\"`: benchmark the TensorRT accelerated model inference.\n", - "- `\"trt_gpu_transforms\"`: benchmark the TensorRT accelerated model inference with GPU transforms.\n", - "- `\"trt_gds_gpu_transforms\"`: benchmark the TensorRT accelerated model inference with GPU data loading and GPU transforms." + "- `\"trt_gpu_transforms\"`: benchmark the model inference with GPU transforms.\n", + "- `\"trt_gds_gpu_transforms\"`: benchmark the model inference with GPU data loading and GPU transforms." 
] }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 4, "metadata": {}, "outputs": [], "source": [ @@ -276,8 +271,12 @@ "from utils import prepare_test_datalist, prepare_model_weights, prepare_tensorrt_model\n", "\n", "root_dir = \".\"\n", + "torch.backends.cudnn.benchmark = True\n", + "torch_tensorrt.runtime.set_multi_device_safe_mode(True)\n", "device = torch.device(\"cuda:0\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", "train_files = prepare_test_datalist(root_dir)\n", + "# since the dataset is too large, the smallest 21 files are used for warm up (1 file) and benchmarking (11 files)\n", + "train_files = sorted(train_files, key=lambda x: os.path.getsize(x), reverse=False)[:21]\n", "weights_path = prepare_model_weights(root_dir=root_dir, bundle_name=\"wholeBody_ct_segmentation\")\n", "trt_model_name = \"model_trt.ts\"\n", "trt_model_path = prepare_tensorrt_model(root_dir, weights_path, trt_model_name)" @@ -292,7 +291,7 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": 6, "metadata": {}, "outputs": [], "source": [ @@ -317,23 +316,6 @@ " return infer_transforms\n", "\n", "\n", - "def get_post_transforms(infer_transforms):\n", - " post_transforms = Compose(\n", - " [\n", - " Activationsd(keys=\"pred\", softmax=True),\n", - " AsDiscreted(keys=\"pred\", argmax=True),\n", - " Invertd(\n", - " keys=\"pred\",\n", - " transform=infer_transforms,\n", - " orig_keys=\"image\",\n", - " nearest_interp=True,\n", - " to_tensor=True,\n", - " ),\n", - " ]\n", - " )\n", - " return post_transforms\n", - "\n", - "\n", "def get_model(device, weights_path, trt_model_path, trt_flag=False):\n", " if not trt_flag:\n", " model = SegResNet(\n", @@ -364,16 +346,16 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 7, "metadata": {}, "outputs": [], "source": [ - "def run_inference(data_list, infer_transforms, post_transforms, model, device, benchmark_type):\n", + "def run_inference(data_list, infer_transforms, 
model, device, benchmark_type):\n", " total_time_dict = {}\n", " roi_size = (96, 96, 96)\n", " sw_batch_size = 1\n", - "\n", - " for idx, sample in enumerate(data_list[:5]):\n", + " \n", + " for idx, sample in enumerate(data_list[:10]):\n", " start = timer()\n", " data = infer_transforms({\"image\": sample})\n", "\n", @@ -383,21 +365,24 @@ " if benchmark_type in [\"trt\", \"original\"]\n", " else data[\"image\"].unsqueeze(0)\n", " )\n", - " if benchmark_type == \"original\":\n", - " with torch.autocast(device_type=\"cuda\"):\n", - " output_image = sliding_window_inference(input_image, roi_size, sw_batch_size, model)\n", - " else:\n", - " output_image = sliding_window_inference(input_image, roi_size, sw_batch_size, model)\n", "\n", - " data[\"pred\"] = output_image.squeeze(0)\n", - " # data = post_transforms(data)\n", + " output_image = sliding_window_inference(input_image, roi_size, sw_batch_size, model)\n", + " output_image = output_image.cpu()\n", "\n", " end = timer()\n", "\n", + " print(output_image.mean())\n", + "\n", + " del data\n", + " del input_image\n", + " del output_image\n", + " torch.cuda.empty_cache()\n", + " gc.collect()\n", + "\n", " sample_name = sample.split(\"/\")[-1]\n", " if idx > 0:\n", " total_time_dict[sample_name] = end - start\n", - "\n", + " print(end - start)\n", " return total_time_dict" ] }, @@ -405,7 +390,20 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Benchmark the end-to-end bundle inference" + "### Running the Benchmark\n", + "\n", + "The cell below will execute the benchmark based on the `benchmark_type` variable.\n", + "\n", + "#### Optional: Using the Python Script\n", + "\n", + "For convenience, a Python script, [`run_benchmark.py`](./run_benchmark.py), is available to run the benchmark. 
You can open a terminal and execute the following command to run the benchmark for all benchmark types:\n", + "\n", + "\n", + "```bash\n", + "for benchmark_type in \"original\" \"trt\" \"trt_gpu_transforms\" \"trt_gds_gpu_transforms\"; do\n", + "    python run_benchmark.py --benchmark_type \"$benchmark_type\"\n", + "done\n", + "```" ] }, { @@ -426,21 +424,86 @@ "     gpu_loading_flag = True\n", "\n", "infer_transforms = get_transforms(device, gpu_loading_flag, gpu_transforms_flag)\n", - "post_transforms = get_post_transforms(infer_transforms)\n", "model = get_model(device, weights_path, trt_model_path, trt_flag)\n", "\n", - "total_time_dict = run_inference(train_files, infer_transforms, post_transforms, model, device, benchmark_type)" + "total_time_dict = run_inference(train_files, infer_transforms, model, device, benchmark_type)\n", + "\n", + "df = pd.DataFrame(list(total_time_dict.items()), columns=[\"file_name\", \"time\"])\n", + "df.to_csv(os.path.join(root_dir, f\"time_{benchmark_type}.csv\"), index=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Analyze and Visualize the Results\n", + "\n", + "In this section, we will analyze and visualize the results.\n", + "All cell outputs presented in this section were obtained on an NVIDIA RTX A6000 GPU."
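The timing discipline used by `run_inference` — wall-clock `timeit.default_timer` around each sample, with the first iteration discarded as warm-up (`idx > 0`) — can be sketched as a standalone helper, independent of MONAI. The helper name and signature are illustrative:

```python
from timeit import default_timer as timer


def time_per_item(fn, items, warmup=1):
    """Measure wall-clock time of fn(item) for each item, skipping the
    first `warmup` iterations.

    The first run typically pays one-off costs (CUDA context creation,
    engine warm-up, caches being populated), so including it would skew
    the reported averages.
    """
    times = {}
    for idx, item in enumerate(items):
        start = timer()
        fn(item)
        end = timer()
        if idx >= warmup:
            times[idx] = end - start
    return times
```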
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Collect Benchmark Results" ] }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 18, "metadata": {}, "outputs": [], "source": [ - "df = pd.DataFrame(list(total_time_dict.items()), columns=[\"file_name\", \"time\"])\n", - "df.to_csv(os.path.join(root_dir, f\"time_{benchmark_type}.csv\"), index=False)" + "# collect benchmark results\n", + "all_df = pd.read_csv(os.path.join(root_dir, f\"time_original.csv\"))\n", + "all_df.columns = [\"file_name\", \"original_time\"]\n", + "\n", + "for benchmark_type in [\"trt\", \"trt_gpu_transforms\", \"trt_gds_gpu_transforms\"]:\n", + " df = pd.read_csv(os.path.join(root_dir, f\"time_{benchmark_type}.csv\"))\n", + " df.columns = [\"file_name\", f\"{benchmark_type}_time\"]\n", + " all_df = pd.merge(all_df, df, on=\"file_name\", how=\"left\")\n", + "\n", + "# for each file, add it's size\n", + "all_df[\"file_size\"] = all_df[\"file_name\"].apply(lambda x: os.path.getsize(os.path.join(root_dir, \"Task03_Liver\", \"imagesTs_nii\", x)))\n", + "# sort by file size\n", + "all_df = all_df.sort_values(by=\"file_size\", ascending=True)\n", + "# convert file size to MB\n", + "all_df[\"file_size\"] = all_df[\"file_size\"] / 1024 / 1024\n", + "# get the average time for each benchmark type\n", + "average_time = all_df.mean(numeric_only=True)\n", + "del average_time[\"file_size\"]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Visualize Average Inference Time for Each Benchmark Type" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plt.figure(figsize=(10, 6))\n", + "average_time.plot(kind='bar', color=['skyblue', 'orange', 'green', 'red'])\n", + "plt.title('Average Inference Time for Each Benchmark Type')\n", + "plt.xlabel('Benchmark Type')\n", + "plt.ylabel('Average Time (seconds)')\n", + "plt.xticks(rotation=45)\n", + "plt.tight_layout()\n", + 
"plt.show()" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/acceleration/fast_inference_tutorial/run_benchmark.py b/acceleration/fast_inference_tutorial/run_benchmark.py new file mode 100644 index 0000000000..0ec96df3d1 --- /dev/null +++ b/acceleration/fast_inference_tutorial/run_benchmark.py @@ -0,0 +1,150 @@ +# Copyright (c) MONAI Consortium +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# http://www.apache.org/licenses/LICENSE-2.0 +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + + +import argparse +import gc +import os +from timeit import default_timer as timer + +import pandas as pd +import torch +import torch_tensorrt +from monai.inferers import sliding_window_inference +from monai.networks.nets import SegResNet +from monai.transforms import (Activationsd, AsDiscreted, Compose, + EnsureChannelFirstd, EnsureTyped, Invertd, + LoadImaged, NormalizeIntensityd, Orientationd, + ScaleIntensityd, Spacingd) + +from utils import (prepare_model_weights, prepare_tensorrt_model, + prepare_test_datalist) + + +def get_transforms(device, gpu_loading_flag=False, gpu_transforms_flag=False): + preprocess_transforms = [ + LoadImaged(keys="image", reader="NibabelReader", to_gpu=gpu_loading_flag), + EnsureChannelFirstd(keys="image"), + Orientationd(keys=["image"], axcodes="RAS"), + Spacingd(keys=["image"], pixdim=(1.5, 1.5, 1.5), mode="bilinear"), + NormalizeIntensityd(keys="image", nonzero=True), + ScaleIntensityd( + keys=["image"], + minv=-1.0, + maxv=1.0, + ), + ] + + if 
gpu_transforms_flag and not gpu_loading_flag: + preprocess_transforms.insert(1, EnsureTyped(keys="image", device=device, track_meta=True)) + infer_transforms = Compose(preprocess_transforms) + + return infer_transforms + +def get_post_transforms(infer_transforms): + post_transforms = Compose( + [ + Activationsd(keys="pred", softmax=True), + AsDiscreted(keys="pred", argmax=True), + Invertd( + keys="pred", + transform=infer_transforms, + orig_keys="image", + nearest_interp=True, + to_tensor=True, + ), + ] + ) + return post_transforms + +def get_model(device, weights_path, trt_model_path, trt_flag=False): + if not trt_flag: + model = SegResNet( + spatial_dims=3, + in_channels=1, + out_channels=105, + init_filters=32, + blocks_down=[1, 2, 2, 4], + blocks_up=[1, 1, 1], + dropout_prob=0.2, + ) + weights = torch.load(weights_path) + model.load_state_dict(weights) + model.to(device) + model.eval() + else: + model = torch.jit.load(trt_model_path) + return model + +def run_inference(data_list, infer_transforms, model, device, benchmark_type): + total_time_dict = {} + roi_size = (96, 96, 96) + sw_batch_size = 1 + + for idx, sample in enumerate(data_list): + start = timer() + data = infer_transforms({"image": sample}) + + with torch.no_grad(): + input_image = ( + data["image"].unsqueeze(0).to(device) + if benchmark_type in ["trt", "original"] + else data["image"].unsqueeze(0) + ) + + output_image = sliding_window_inference(input_image, roi_size, sw_batch_size, model) + output_image = output_image.cpu() + + end = timer() + + del data + del input_image + del output_image + torch.cuda.empty_cache() + gc.collect() + + sample_name = sample.split("/")[-1] + if idx > 0: + total_time_dict[sample_name] = end - start + + return total_time_dict + +def main(): + parser = argparse.ArgumentParser(description="Run inference benchmark.") + parser.add_argument("--benchmark_type", type=str, default="original", help="Type of benchmark to run") + args = parser.parse_args() + + ### Prepare the 
environment + root_dir = "." + torch.backends.cudnn.benchmark = True + torch_tensorrt.runtime.set_multi_device_safe_mode(True) + device = torch.device("cuda:0") if torch.cuda.is_available() else torch.device("cpu") + train_files = prepare_test_datalist(root_dir) + # since the dataset is too large, the smallest 21 files are used for warm up (1 file) and benchmarking (20 files) + train_files = sorted(train_files, key=lambda x: os.path.getsize(x), reverse=False)[:21] + weights_path = prepare_model_weights(root_dir=root_dir, bundle_name="wholeBody_ct_segmentation") + trt_model_name = "model_trt.ts" + trt_model_path = prepare_tensorrt_model(root_dir, weights_path, trt_model_name) + + gpu_transforms_flag = "gpu_transforms" in args.benchmark_type + gpu_loading_flag = "gds" in args.benchmark_type + trt_flag = "trt" in args.benchmark_type + # Get components + infer_transforms = get_transforms(device, gpu_loading_flag, gpu_transforms_flag) + model = get_model(device, weights_path, trt_model_path, trt_flag) + # Run inference + total_time_dict = run_inference(train_files, infer_transforms, model, device, args.benchmark_type) + # Save the results + df = pd.DataFrame(list(total_time_dict.items()), columns=["file_name", "time"]) + df.to_csv(os.path.join(root_dir, f"time_{args.benchmark_type}.csv"), index=False) + +if __name__ == "__main__": + main() diff --git a/acceleration/fast_inference_tutorial/utils.py b/acceleration/fast_inference_tutorial/utils.py index 0e8eec95f4..ac14f55845 100644 --- a/acceleration/fast_inference_tutorial/utils.py +++ b/acceleration/fast_inference_tutorial/utils.py @@ -76,7 +76,7 @@ def prepare_tensorrt_model(root_dir, weights_path, trt_model_name="model_trt.ts" model.load_state_dict(weights) torchscript_model = convert_to_trt( model=model, - precision="fp32", + precision="fp16", input_shape=[1, 1, 96, 96, 96], dynamic_batchsize=[1, 1, 1], use_trace=True, From dc1c24f444c6d4ea2873b3ebe075fc35cdf4789e Mon Sep 17 00:00:00 2001 From: "pre-commit-ci[bot]" 
<66853113+pre-commit-ci[bot]@users.noreply.github.com> Date: Fri, 7 Mar 2025 09:45:53 +0000 Subject: [PATCH 06/11] [pre-commit.ci] auto fixes from pre-commit.com hooks for more information, see https://pre-commit.ci --- .../fast_inference_tutorial.ipynb | 14 +++++---- .../fast_inference_tutorial/run_benchmark.py | 29 ++++++++++++++----- 2 files changed, 29 insertions(+), 14 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 23d18260b1..867bb691be 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -354,7 +354,7 @@ " total_time_dict = {}\n", " roi_size = (96, 96, 96)\n", " sw_batch_size = 1\n", - " \n", + "\n", " for idx, sample in enumerate(data_list[:10]):\n", " start = timer()\n", " data = infer_transforms({\"image\": sample})\n", @@ -465,7 +465,9 @@ " all_df = pd.merge(all_df, df, on=\"file_name\", how=\"left\")\n", "\n", "# for each file, add its size\n", - "all_df[\"file_size\"] = all_df[\"file_name\"].apply(lambda x: os.path.getsize(os.path.join(root_dir, \"Task03_Liver\", \"imagesTs_nii\", x)))\n", + "all_df[\"file_size\"] = all_df[\"file_name\"].apply(\n", + " lambda x: os.path.getsize(os.path.join(root_dir, \"Task03_Liver\", \"imagesTs_nii\", x))\n", + ")\n", "# sort by file size\n", "all_df = all_df.sort_values(by=\"file_size\", ascending=True)\n", "# convert file size to MB\n", @@ -489,10 +491,10 @@ "outputs": [], "source": [ "plt.figure(figsize=(10, 6))\n", - "average_time.plot(kind='bar', color=['skyblue', 'orange', 'green', 'red'])\n", - "plt.title('Average Inference Time for Each Benchmark Type')\n", - "plt.xlabel('Benchmark Type')\n", - "plt.ylabel('Average Time (seconds)')\n", + "average_time.plot(kind=\"bar\", color=[\"skyblue\", \"orange\", \"green\", \"red\"])\n", + "plt.title(\"Average Inference Time for Each Benchmark 
Type\")\n", + "plt.xlabel(\"Benchmark Type\")\n", + "plt.ylabel(\"Average Time (seconds)\")\n", "plt.xticks(rotation=45)\n", "plt.tight_layout()\n", "plt.show()" diff --git a/acceleration/fast_inference_tutorial/run_benchmark.py b/acceleration/fast_inference_tutorial/run_benchmark.py index 0ec96df3d1..a825988310 100644 --- a/acceleration/fast_inference_tutorial/run_benchmark.py +++ b/acceleration/fast_inference_tutorial/run_benchmark.py @@ -20,13 +20,21 @@ import torch_tensorrt from monai.inferers import sliding_window_inference from monai.networks.nets import SegResNet -from monai.transforms import (Activationsd, AsDiscreted, Compose, - EnsureChannelFirstd, EnsureTyped, Invertd, - LoadImaged, NormalizeIntensityd, Orientationd, - ScaleIntensityd, Spacingd) - -from utils import (prepare_model_weights, prepare_tensorrt_model, - prepare_test_datalist) +from monai.transforms import ( + Activationsd, + AsDiscreted, + Compose, + EnsureChannelFirstd, + EnsureTyped, + Invertd, + LoadImaged, + NormalizeIntensityd, + Orientationd, + ScaleIntensityd, + Spacingd, +) + +from utils import prepare_model_weights, prepare_tensorrt_model, prepare_test_datalist def get_transforms(device, gpu_loading_flag=False, gpu_transforms_flag=False): @@ -49,6 +57,7 @@ def get_transforms(device, gpu_loading_flag=False, gpu_transforms_flag=False): return infer_transforms + def get_post_transforms(infer_transforms): post_transforms = Compose( [ @@ -65,6 +74,7 @@ def get_post_transforms(infer_transforms): ) return post_transforms + def get_model(device, weights_path, trt_model_path, trt_flag=False): if not trt_flag: model = SegResNet( @@ -84,11 +94,12 @@ def get_model(device, weights_path, trt_model_path, trt_flag=False): model = torch.jit.load(trt_model_path) return model + def run_inference(data_list, infer_transforms, model, device, benchmark_type): total_time_dict = {} roi_size = (96, 96, 96) sw_batch_size = 1 - + for idx, sample in enumerate(data_list): start = timer() data = 
infer_transforms({"image": sample}) @@ -117,6 +128,7 @@ def run_inference(data_list, infer_transforms, model, device, benchmark_type): return total_time_dict + def main(): parser = argparse.ArgumentParser(description="Run inference benchmark.") parser.add_argument("--benchmark_type", type=str, default="original", help="Type of benchmark to run") @@ -146,5 +158,6 @@ def main(): df = pd.DataFrame(list(total_time_dict.items()), columns=["file_name", "time"]) df.to_csv(os.path.join(root_dir, f"time_{args.benchmark_type}.csv"), index=False) + if __name__ == "__main__": main() From f4840a76f2c071f1d12b96342e33d9e2966f561e Mon Sep 17 00:00:00 2001 From: Yiheng Wang Date: Sat, 8 Mar 2025 03:53:36 +0000 Subject: [PATCH 07/11] finalize report Signed-off-by: Yiheng Wang --- .../fast_inference_tutorial.ipynb | 182 ++++++++++++++---- .../fast_inference_tutorial/run_benchmark.py | 15 +- acceleration/fast_inference_tutorial/utils.py | 2 +- runner.sh | 2 + 4 files changed, 156 insertions(+), 45 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 23d18260b1..59864f534b 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -24,13 +24,17 @@ "source": [ "## Accelerating Model Inference with MONAI\n", "\n", - "In this tutorial, we explore three powerful features that can accelerate model inference using MONAI. These features are designed to optimize the data handling and computational efficiency of your inference pipeline, particularly when working with NVIDIA GPUs. 
The tutorial will guide you through the following features and provide a comprehensive benchmarking strategy to evaluate the performance improvements offered by each feature:\n", + "In the rapidly evolving field of medical imaging, the ability to perform fast and efficient model inference is crucial for real-time diagnostics and treatment planning. This tutorial explores three advanced features of the MONAI framework that are designed to significantly accelerate model inference, particularly when leveraging the computational power of NVIDIA GPUs.\n", "\n", - "1. **TensorRT Inference**: Utilize NVIDIA's TensorRT to optimize and execute models for high-performance inference on NVIDIA GPUs.\n", + "1. **TensorRT Inference**: Learn how to utilize NVIDIA's TensorRT to optimize and execute models for high-performance inference, reducing latency and improving throughput.\n", "\n", - "2. **GPU-Based Preprocessing**: Leverage the computational power of GPUs to perform data preprocessing directly on the GPU. This can significantly reduce the time spent on data preparation, enabling faster inference.\n", + "2. **GPU-Based Preprocessing**: Discover how to offload data preprocessing tasks to the GPU, minimizing CPU bottlenecks and accelerating the overall inference pipeline.\n", "\n", - "3. **Direct GPU Data Loading**: Minimize data transfer times by loading data directly from disk into GPU memory. This feature supports NIfTI and DICOM file formats." + "3. **Direct GPU Data Loading**: Understand the benefits of loading data directly from disk into GPU memory, which reduces data transfer times and enhances processing efficiency.\n", + "\n", + "In addition to exploring these features, this tutorial provides a comprehensive benchmarking strategy to evaluate the performance improvements offered by each feature. 
We will use MONAI's [wholeBody_ct_segmentation](https://github.com/Project-MONAI/model-zoo/tree/dev/models/wholeBody_ct_segmentation) as a reference and build a Liver CT segmentation model for benchmarking purposes.\n", + "\n", + "Finally, we will analyze and visualize the benchmark results, offering insights into the performance gains achieved through these optimizations. By the end of this tutorial, you will have a deeper understanding of how to leverage MONAI's capabilities to enhance the efficiency of your medical imaging workflows." ] }, { @@ -121,7 +125,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Introduction on Fast Inference Features\n" + "## Introduction to Fast Inference Features" ] }, { @@ -136,7 +140,7 @@ "\n", "example:\n", "\n", - "```python\n", + "```py\n", "\n", "from monai.networks.nets import SegResNet\n", "from monai.networks.utils import convert_to_trt\n", @@ -157,7 +161,7 @@ " model=model,\n", " precision=\"fp16\",\n", " input_shape=[1, 1, 96, 96, 96],\n", - " dynamic_batchsize=[1, 1, 1],\n", + " dynamic_batchsize=[1, 4, 4],\n", " use_trace=True,\n", " verify=False,\n", ")\n", @@ -239,7 +243,7 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 2, "metadata": {}, "outputs": [], "source": [ @@ -257,16 +261,28 @@ "source": [ "### Prepare Data and Model\n", "\n", - "The [Medical Segmentation Decathlon Task03 Liver dataset](http://medicaldecathlon.com/) is used as an example to benchmark the acceleration performance.A helper script, [`prepare_data.py`](./prepare_data.py), is used to download and extract the dataset. In addition, the script also prepares the model weights and TensorRT engine-based TorchScript model.\n", + "The [Medical Segmentation Decathlon Task03 Liver dataset](http://medicaldecathlon.com/) is used as an example to benchmark the acceleration performance.\n", + "\n", + "A helper script, [`prepare_data.py`](./prepare_data.py), is used to download and extract the dataset. 
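The size-based subset selection used for benchmarking (sorting by `os.path.getsize` and keeping only the smallest volumes to bound runtime) can be sketched on its own; the file names and byte sizes below are hypothetical stand-ins for the downloaded NIfTI files:

```python
import os
import tempfile

# Create a few hypothetical files of known sizes; in the tutorial the
# paths come from prepare_test_datalist and the sizes from the real dataset.
with tempfile.TemporaryDirectory() as d:
    paths = []
    for name, size in [("a.nii", 300), ("b.nii", 100), ("c.nii", 200)]:
        path = os.path.join(d, name)
        with open(path, "wb") as f:
            f.write(b"\0" * size)
        paths.append(path)
    # keep the two smallest files, mirroring the "smallest N files" selection
    subset = sorted(paths, key=os.path.getsize)[:2]
    names = [os.path.basename(p) for p in subset]

print(names)  # ['b.nii', 'c.nii']
```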
In addition, the script also prepares the model weights and TensorRT engine-based TorchScript model.\n", "\n", "The script automatically checks for existing data. This ensures that repeated executions of the notebook do not result in redundant operations." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 3, "metadata": {}, - "outputs": [], + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Test data already exists at ./Task03_Liver/imagesTs_nii\n", + "Weights already exists at ./model.pt\n", + "TensorRT model already exists at ./model_trt.ts\n" + ] + } + ], "source": [ "from utils import prepare_test_datalist, prepare_model_weights, prepare_tensorrt_model\n", "\n", @@ -275,8 +291,8 @@ "torch_tensorrt.runtime.set_multi_device_safe_mode(True)\n", "device = torch.device(\"cuda:0\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", "train_files = prepare_test_datalist(root_dir)\n", - "# since the dataset is too large, the smallest 21 files are used for warm up (1 file) and benchmarking (11 files)\n", - "train_files = sorted(train_files, key=lambda x: os.path.getsize(x), reverse=False)[:21]\n", + "# since the dataset is too large, the smallest 31 files are used for warm up (1 file) and benchmarking (30 files)\n", + "train_files = sorted(train_files, key=lambda x: os.path.getsize(x), reverse=False)[:31]\n", "weights_path = prepare_model_weights(root_dir=root_dir, bundle_name=\"wholeBody_ct_segmentation\")\n", "trt_model_name = \"model_trt.ts\"\n", "trt_model_path = prepare_tensorrt_model(root_dir, weights_path, trt_model_name)" @@ -291,7 +307,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 4, "metadata": {}, "outputs": [], "source": [ @@ -346,14 +362,14 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "def run_inference(data_list, infer_transforms, model, device, benchmark_type):\n", " total_time_dict = 
{}\n", " roi_size = (96, 96, 96)\n", - " sw_batch_size = 1\n", + " sw_batch_size = 4\n", " \n", " for idx, sample in enumerate(data_list[:10]):\n", " start = timer()\n", @@ -394,7 +410,7 @@ "\n", "The cell below will execute the benchmark based on the `benchmark_type` variable.\n", "\n", - "#### Optional: Using the Python Script\n", + "#### (Optional) Using the Python Script\n", "\n", "For convenience, a Python script, [`run_benchmark.py`](./run_benchmark.py), is available to run the benchmark. You can open a terminal and execute the following command to run the benchmark for all benchmark types:\n", "\n", @@ -442,16 +458,9 @@ "All cell outputs presented in this section were obtained on an NVIDIA RTX A6000 GPU." ] }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Collect Benchmark Results" - ] - }, { "cell_type": "code", - "execution_count": 18, + "execution_count": 12, "metadata": {}, "outputs": [], "source": [ @@ -470,47 +479,142 @@ "all_df = pd.merge(all_df, df, on=\"file_name\", how=\"left\")\n", "\n", "# for each file, add its size\n", "all_df[\"file_size\"] = all_df[\"file_name\"].apply(\n", " lambda x: os.path.getsize(os.path.join(root_dir, \"Task03_Liver\", \"imagesTs_nii\", x))\n", ")\n", "# sort by file size\n", "all_df = all_df.sort_values(by=\"file_size\", ascending=True)\n", "# convert file size to MB\n", "all_df[\"file_size\"] = all_df[\"file_size\"] / 1024 / 1024\n", - "# get the average time for each benchmark type\n", - "average_time = all_df.mean(numeric_only=True)\n", - "del average_time[\"file_size\"]" + "# get the total time for each benchmark type\n", + "total_time = all_df.sum(numeric_only=True)\n", + "del total_time[\"file_size\"]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Visualize Average Inference Time for Each Benchmark Type" + "### Analyze the Total Inference Time\n", + "\n", + "- TensorRT Improvement:\n", + "Switching from the original model to TensorRT (`trt_time`) results in a slight performance improvement, reducing inference time by 0.93%.\n", + "\n", + "- TensorRT + GPU Transforms Improvement:\n", + "Incorporating GPU transforms (`trt_gpu_transforms_time`) further reduces the inference time by 9.32%.\n", + "\n", + "- TensorRT + GDS + GPU Transforms Improvement:\n", + 
"The combination of GPU Direct Storage and GPU transforms (`trt_gds_gpu_transforms_time`) provides the most substantial improvement, reducing inference time by more than 55% compared to the original model." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 13, "metadata": {}, - "outputs": [], + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "original_time 360.122527\n", + "trt_time 356.739906\n", + "trt_gpu_transforms_time 326.563954\n", + "trt_gds_gpu_transforms_time 160.416928\n", + "dtype: float64\n" + ] + } + ], + "source": [ + "print(total_time)" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "TensorRT Improvement: 0.009392972563697605\n", + "TensorRT + GPU Transforms Improvement: 0.09318654129529037\n", + "TensorRT + GDS + GPU Transforms Improvement: 0.5545490328713701\n" + ] + } + ], + "source": [ + "print(\"TensorRT Improvement: \", (total_time[\"original_time\"] - total_time[\"trt_time\"]) / total_time[\"original_time\"])\n", + "print(\"TensorRT + GPU Transforms Improvement: \", (total_time[\"original_time\"] - total_time[\"trt_gpu_transforms_time\"]) / total_time[\"original_time\"])\n", + "print(\"TensorRT + GDS + GPU Transforms Improvement: \", (total_time[\"original_time\"] - total_time[\"trt_gds_gpu_transforms_time\"]) / total_time[\"original_time\"])" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAA90AAAJOCAYAAACqS2TfAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvc2/+5QAAAAlwSFlzAAAPYQAAD2EBqD+naQAAmP5JREFUeJzs3XdUFOfbxvFr6aKCjWLvUbHEblBj7L1FbNFEjYoNe9cYSxJLNDEaNdbYe2+JYq/B3mvUGDUqdkFRENh9//BlfxBLEFkX8Ps5Z89hn5mdvWfZ2d1rnplnDCaTySQAAAAAABDvbKxdAAAAAAAASRWhGwAAAAAACyF0AwAAAABgIYRuAAAAAAAshNANAAAAAICFELoBAAAAALAQQjcAAAAAABZC6AYAAAAAwEII3QAAAAAAWAihGwASkR07dshgMGjHjh0WfZ7Hjx+rbdu28vT0lMFgUPfu3S36fEnBu/rfvImIiAj17dtXmTNnlo2NjerXr2/tkuJd1Ou+fPlya5diMYltHYcOHSqDwaC7d+9auxQASBAI3QDwHwwGQ6xusQlbI0aM0OrVqy1e8+zZs2UwGHTo0KE4PX7EiBGaPXu2OnbsqHnz5umLL76I5woTh1atWsXqf9+qVStrl/pSM2fO1JgxY9SwYUPNmTNHPXr0sOjzlS9f/pWvUd68eS363HEVta1Ev7m7u6tChQrasGGDtct7r8TnZy0AJCR21i4AABK6efPmxbg/d+5cbd68+YX2fPny/eeyRowYoYYNGyb4Hsdt27bpo48+0pAhQ6xdilW1b99elStXNt+/fPmyBg8erHbt2unjjz82t+fMmVOlSpXS06dP5eDgYI1SX2rbtm3KmDGjfvrpp3f2nJkyZdLIkSNfaHd1dX1nNcTFN998o+zZs8tkMunWrVuaPXu2atasqXXr1ql27drWLu+9EJ+ftQCQkBC6AeA/fP755zHu79u3T5s3b36hPSm5ffu2vLy84m15RqNRz549k5OTU7wt813w9vaWt7e3+f6hQ4c0ePBgeXt7v/T/n9DW7/bt20qVKlW8LS82/0dXV9dEuW3UqFFDxYsXN99v06aNPDw8tGjRIkJ3LIWEhCh58uRxfvz7+FkL4P3A4eUAEA9CQkLUq1cvZc6cWY6OjsqTJ49++OEHmUwm8zwGg0EhISGaM2fOC4clX7lyRZ06dVKePHmULFkypU2bVo0aNdLff/8dbzW2atVKKVKk0PXr11W/fn2lSJFCbm5u6t27tyIjIyX979zRy5cv67fffjPXGVVHWFiYhgwZoly5csnR0VGZM2dW3759FRYWFuO5DAaDOnfurAULFih//vxydHTUxo0bJUnXr19X69at5eHhIUdHR+XPn18zZ86M8fioOpYuXarhw4crU6ZMcnJyUqVKlXTx4sUX1m3//v2qWbOmUqdOreTJk6tQoUIaP358jHnOnTunhg0bKk2aNHJyclLx4sW1du3a+Hp5X3pOd/ny5VWgQAGdOHFCn3zyiZydnZUrVy7zubk7d+5UqVKllCxZMuXJk0dbtmx5Ybmxeb3+7e+//5bBYND27dt1+vTpFw7Ljc37VXr9//FtvMn7/eHDh+rRo4eyZcsmR0dHZcqUSS1atHjhfGGj0Rir90pspUqVSsmSJZOdXcz+CaPRqHHjxil//vxycnKSh4eH2rdvrwcPHsSYL1u2bKpdu7b27NmjkiVLysnJSTly5NDcuXMtuo5v+56L7f8m6rD8nTt3qlOnTnJ3d1emTJle+XpeuXJFuXLlUoECBXTr1q1Xzvc6LVu2VLp06RQeHv7CtKpVqypPnjzm+9Hfu3ny5JGTk5OKFSumXbt2vfDYuGxjAPAm6OkGgLdkMplUt25dbd++XW3atFHhwoXl7++vPn366Pr16+ZDe+fNm6e2bduqZMmSateunaTnhyVL0sGDB/XHH3+oadOmypQpk/7++29NnjxZ5cuX15kzZ+Ts7BwvtUZGRqpatWoqVaq
UfvjhB23ZskU//vijcubMqY4dOypfvnyaN2+eevTooUyZMqlXr16SJDc3NxmNRtWtW1d79uxRu3btlC9fPp08eVI//fST/vzzzxfOVd+2bZuWLl2qzp07K126dMqWLZtu3bqljz76yPyD2M3NTRs2bFCbNm0UHBz8woBto0aNko2NjXr37q2goCCNHj1azZs31/79+83zbN68WbVr11b69OnVrVs3eXp66uzZs1q/fr26desmSTp9+rTKlCmjjBkzqn///kqePLmWLl2q+vXra8WKFfr000/j5fV9mQcPHqh27dpq2rSpGjVqpMmTJ6tp06ZasGCBunfvrg4dOqhZs2bmc6+vXbumlClTStIbv15R3NzcNG/ePA0fPlyPHz82H+6dL1++WL9fo7zs//g6kZGRLx1AK1myZOZe0Ni+3x8/fqyPP/5YZ8+eVevWrVW0aFHdvXtXa9eu1T///KN06dKZlx+b98rrBAUF6e7duzKZTLp9+7YmTJigx48fv9DL2r59e82ePVtffvmlunbtqsuXL2vixIk6evSo9u7dK3t7e/O8Fy9eVMOGDdWmTRu1bNlSM2fOVKtWrVSsWDHlz5/fYuv4Nu+5N/0s6tSpk9zc3DR48GCFhIS89LW9dOmSKlasqDRp0mjz5s0x1ulNfPHFF5o7d678/f1jHH0QGBiobdu2vXA6zM6dO7VkyRJ17dpVjo6O+uWXX1S9enUdOHBABQoUkBT3bQwA3ogJAPBG/Pz8TNE/PlevXm2SZPruu+9izNewYUOTwWAwXbx40dyWPHlyU8uWLV9Y5pMnT15oCwgIMEkyzZ0719y2fft2kyTT9u3bX1vjrFmzTJJMBw8eNLe1bNnSJMn0zTffxJi3SJEipmLFisVoy5o1q6lWrVox2ubNm2eysbEx7d69O0b7lClTTJJMe/fuNbdJMtnY2JhOnz4dY942bdqY0qdPb7p7926M9qZNm5pcXV3Nr0PUeubLl88UFhZmnm/8+PEmSaaTJ0+aTCaTKSIiwpQ9e3ZT1qxZTQ8ePIixTKPRaP67UqVKpoIFC5pCQ0NjTC9durQpd+7cptg6ePCgSZJp1qxZL0x72f/mk08+MUkyLVy40Nx27tw58+uzb98+c7u/v/8Ly47t6/Uqn3zyiSl//vwx2t7k/fqq/+Prnk/SS2/t27c3zxfb9/vgwYNNkkwrV658Yf6o/29s3yuvErWt/Pvm6Ohomj17dox5d+/ebZJkWrBgQYz2jRs3vtCeNWtWkyTTrl27zG23b982OTo6mnr16mWxdXzb91xs/zdRr1vZsmVNERERMeYfMmSISZLpzp07prNnz5oyZMhgKlGihOn+/fsvLPt1/v1ZGxkZacqUKZOpSZMmMeYbO3asyWAwmP766y9zW9T/8dChQ+a2K1eumJycnEyffvqpue1ttzEAiA0OLweAt/T777/L1tZWXbt2jdHeq1cvmUymWI2AnCxZMvPf4eHhunfvnnLlyqVUqVLpyJEj8Vpvhw4dYtz/+OOP9ddff/3n45YtW6Z8+fIpb968unv3rvlWsWJFSdL27dtjzP/JJ5/EOC/cZDJpxYoVqlOnjkwmU4xlVKtWTUFBQS+s65dffhljYLKowcui6j169KguX76s7t27v3DussFgkCTdv39f27ZtU+PGjfXo0SPzc967d0/VqlXThQsXdP369f9c/7hKkSKFmjZtar6fJ08epUqVSvny5VOpUqXM7VF/R61bXF6v2HjT9+u//4//JVu2bNq8efMLt+g9hrF9v69YsUIffvjhS49EiPr/Rvmv98p/mTRpkrnW+fPnq0KFCmrbtq1WrlxpnmfZsmVydXVVlSpVYvw/ihUrphQpUrywDXh5ecUYcM/NzU158uSJUZMl1jGu7znpzT+LfH19ZWtr+0K7JJ06dUqffPKJsmXLpi1btih16tQvnS+2bGxs1Lx5c61du1aPHj0yty9YsEClS5dW9uzZY8zv7e2tYsWKme9nyZJ
F9erVk7+/vyIjIy22jQHAv3F4OQC8pStXrihDhgzmwzOjRI2we+XKlf9cxtOnTzVy5EjNmjVL169fj3FubVBQULzV6uTkJDc3txhtqVOnfuF81Je5cOGCzp49+8Ljo9y+fTvG/X//AL5z544ePnyoadOmadq0abFaRpYsWV6oVZK53kuXLkmS+VDRl7l48aJMJpO+/vprff3116983owZM75yGW8jU6ZML4QnV1dXZc6c+YU26X/rFpfXKzbe9P367//jf0mePHmMEd9fJrbv90uXLsnHxydWz/tf75X/UrJkyRgDqX322WcqUqSIOnfurNq1a8vBwUEXLlxQUFCQ3N3dX7qM/3r/RtUVvSZLrGNc33PSm38Wve79UadOHXl4eMjf318pUqR43arFWosWLfT9999r1apVatGihc6fP6/Dhw9rypQpL8ybO3fuF9o++OADPXnyRHfu3JGNjY1FtjEA+DdCNwAkAF26dNGsWbPUvXt3eXt7y9XVVQaDQU2bNpXRaIy353lVj1RsGI1GFSxYUGPHjn3p9H//oI/eYxb1eOn5CMUtW7Z86TIKFSoU4/6r6o0eBP5L1PP27t1b1apVe+k8uXLlivXy3tSr1uG/1i0ur5cl/Pv/GB8s8X6Pj/dKdDY2NqpQoYLGjx+vCxcuKH/+/DIajXJ3d9eCBQte+ph/75CK75piu7y4vuekN//fvO794ePjozlz5mjBggVq3779K+d7E15eXipWrJjmz5+vFi1aaP78+XJwcFDjxo3feFkJZRsDkPQRugHgLWXNmlVbtmzRo0ePYvQenjt3zjw9yr97n6IsX75cLVu21I8//mhuCw0N1cOHDy1TdBzkzJlTx48fV6VKlV65Hq/j5uamlClTKjIy8j97Qt+kJun5YayvWmaOHDkkSfb29vH2vO+CJV4v6c3er5YS2/d7zpw5derUKYvX8yoRERGSng92FlXPli1bVKZMmXjbGWHtdfy3+PwsGjNmjOzs7NSpUyelTJlSzZo1i5caW7RooZ49e+rmzZtauHChatWq9dJD1y9cuPBC259//ilnZ2fzDhJLbGMA8G+c0w0Ab6lmzZqKjIzUxIkTY7T/9NNPMhgMqlGjhrktefLkL/3xamtr+0Jv1YQJE8yX8koIGjdurOvXr2v69OkvTHv69OkrRy6OYmtrKx8fH61YseKlIePOnTtvXFPRokWVPXt2jRs37oXXNer1dHd3V/ny5TV16lTdvHkzXp73XbDE6yW92fvVUmL7fvfx8dHx48e1atWqF5YR197i2AoPD9emTZvk4OBgPvS+cePGioyM1LfffvvC/BEREXEKptZcx5eJz88ig8GgadOmqWHDhmrZsmW8XaLvs88+k8FgULdu3fTXX3+98jreAQEBMc7JvnbtmtasWaOqVavK1tbWYtsYAPwbPd0A8Jbq1KmjChUq6KuvvtLff/+tDz/8UJs2bdKaNWvUvXt3c2+sJBUrVkxbtmzR2LFjlSFDBmXPnl2lSpVS7dq1NW/ePLm6usrLy0sBAQHasmWL0qZNa8U1i+mLL77Q0qVL1aFDB23fvl1lypRRZGSkzp07p6VLl8rf3z/GObEvM2rUKG3fvl2lSpWSr6+vvLy8dP/+fR05ckRbtmzR/fv336gmGxsbTZ48WXXq1FHhwoX15ZdfKn369Dp37pxOnz4tf39/Sc8HySpbtqwKFiwoX19f5ciRQ7du3VJAQID++ecfHT9+PM6viyXF9+slvdn7NS6CgoI0f/78l06LCkexfb/36dNHy5cvV6NGjdS6dWsVK1ZM9+/f19q1azVlyhR9+OGHb1VrdBs2bDD39t++fVsLFy7UhQsX1L9/f7m4uEh6Pqhc+/btNXLkSB07dkxVq1aVvb29Lly4oGXLlmn8+PFq2LDhGz3vu1zH2IjvzyIbGxvNnz9f9evXV+PGjfX777+bB1+MKzc3N1WvXl3Lli1TqlSpVKtWrZfOV6BAAVWrVi3GJcMkadi
wYeZ5LLGNAcC/EboB4C3Z2Nho7dq1Gjx4sJYsWaJZs2YpW7ZsGjNmjPk611HGjh2rdu3aadCgQXr69KlatmypUqVKafz48bK1tdWCBQsUGhqqMmXKaMuWLa88B9kabGxstHr1av3000+aO3euVq1aJWdnZ+XIkUPdunXTBx988J/L8PDw0IEDB/TNN99o5cqV+uWXX5Q2bVrlz59f33//fZzqqlatmrZv365hw4bpxx9/lNFoVM6cOeXr62uex8vLS4cOHdKwYcM0e/Zs3bt3T+7u7ipSpIgGDx4cp+d9Fyzxer3J+zUu/vnnH33xxRcvnRYVumP7fk+RIoV2796tIUOGaNWqVZozZ47c3d1VqVIlZcqU6a1rjS76+8DJyUl58+bV5MmTXzgXecqUKSpWrJimTp2qgQMHys7OTtmyZdPnn3+uMmXKvPHzvst1jA1LfBbZ29tr+fLlqlGjhurVq6ctW7bEGEU9Llq0aKH169ercePGcnR0fOk8n3zyiby9vTVs2DBdvXpVXl5emj17dozztC2xjQHAvxlM1jh2CQAAAIijNWvWqH79+tq1a1eMy7JFMRgM8vPze+E0CgCwBs7pBgAAQKIyffp05ciRQ2XLlrV2KQDwnzi8HAAAAInC4sWLdeLECf32228aP358nK6kAADvGqEbAAAAicJnn32mFClSqE2bNurUqZO1ywGAWOGcbgAAAAAALIRzugEAAAAAsBBCNwAAAAAAFsI53ZKMRqNu3LihlClTMiAHAAAAAOA/mUwmPXr0SBkyZJCNzav7swndkm7cuKHMmTNbuwwAAAAAQCJz7do1ZcqU6ZXTCd2SUqZMKen5i+Xi4mLlagAAAAAACV1wcLAyZ85szpOvQuiWzIeUu7i4ELoBAAAAALH2X6coM5AaAAAAAAAWQugGAAAAAMBCCN0AAAAAAFgIoRsAAAAAAAshdAMAAAAAYCGEbgAAAAAALITQDQAAAACAhRC6AQAAAACwEEI3AAAAAAAWQugGAAAAAMBCCN0AAAAAAFgIoRsAAAAAAAshdAMAAAAAYCGEbgAAAAAALMTO2gXg3Rl19K61S0A86F8knbVLAAAAABBL9HQDAAAAAGAhhG4AAAAAACyE0A0AAAAAgIVwTjcAWMNCg7UrQHxpZrJ2BQAAIAGjpxsAAAAAAAuxauiePHmyChUqJBcXF7m4uMjb21sbNmwwTy9fvrwMBkOMW4cOHWIs4+rVq6pVq5acnZ3l7u6uPn36KCIi4l2vCgAAAAAAL7Dq4eWZMmXSqFGjlDt3bplMJs2ZM0f16tXT0aNHlT9/fkmSr6+vvvnmG/NjnJ2dzX9HRkaqVq1a8vT01B9//KGbN2+qRYsWsre314gRI975+gAAAAAAEJ1VQ3edOnVi3B8+fLgmT56sffv2mUO3s7OzPD09X/r4TZs26cyZM9qyZYs8PDxUuHBhffvtt+rXr5+GDh0qBwcHi68DAAAAAACvkmDO6Y6MjNTixYsVEhIib29vc/uCBQuULl06FShQQAMGDNCTJ0/M0wICAlSwYEF5eHiY26pVq6bg4GCdPn36lc8VFham4ODgGDcAAAAAAOKb1UcvP3nypLy9vRUaGqoUKVJo1apV8vLykiQ1a9ZMWbNmVYYMGXTixAn169dP58+f18qVKyVJgYGBMQK3JPP9wMDAVz7nyJEjNWzYMAutEQAAAAAAz1k9dOfJk0fHjh1TUFCQli9frpYtW2rnzp3y8vJSu3btzPMVLFhQ6dOnV6VKlXTp0iXlzJkzzs85YMAA9ezZ03w/ODhYmTNnfqv1AAAAAADg36x+eLmDg4Ny5cqlYsWKaeTIkfrwww81fvz4l85bqlQpSdLFixclSZ6enrp161aMeaLuv+o8cElydHQ0j5gedQMAAAAAIL5ZPXT/m9FoVFhY2EunHTt2TJKUPn16SZK3t7dOnjyp27dvm+fZvHmzXFxczIeoAwAAAABgLVY9vHzAgAGqUaOGsmTJoke
PHmnhwoXasWOH/P39denSJS1cuFA1a9ZU2rRpdeLECfXo0UPlypVToUKFJElVq1aVl5eXvvjiC40ePVqBgYEaNGiQ/Pz85OjoaM1VAwAAAADAuqH79u3batGihW7evClXV1cVKlRI/v7+qlKliq5du6YtW7Zo3LhxCgkJUebMmeXj46NBgwaZH29ra6v169erY8eO8vb2VvLkydWyZcsY1/UGAAAAAMBarBq6f/3111dOy5w5s3bu3Pmfy8iaNat+//33+CwLAAAAAIB4keDO6QYAAAAAIKkgdAMAAAAAYCGEbgAAAAAALITQDQAAAACAhRC6AQAAAACwEEI3AAAAAAAWQugGAAAAAMBCCN0AAAAAAFgIoRsAAAAAAAshdAMAAAAAYCGEbgAAAAAALITQDQAAAACAhRC6AQAAAACwEEI3AAAAAAAWQugGAAAAAMBC7KxdAAAAgLUZhhmsXQLigWmIydolAMAL6OkGAAAAAMBCCN0AAAAAAFgIoRsAAAAAAAshdAMAAAAAYCGEbgAAAAAALITQDQAAAACAhRC6AQAAAACwEEI3AAAAAAAWQugGAAAAAMBCCN0AAAAAAFgIoRsAAAAAAAshdAMAAAAAYCGEbgAAAAAALITQDQAAAACAhRC6AQAAAACwEEI3AAAAAAAWQugGAAAAAMBCCN0AAAAAAFgIoRsAAAAAAAshdAMAAAAAYCFWDd2TJ09WoUKF5OLiIhcXF3l7e2vDhg3m6aGhofLz81PatGmVIkUK+fj46NatWzGWcfXqVdWqVUvOzs5yd3dXnz59FBER8a5XBQAAAACAF1g1dGfKlEmjRo3S4cOHdejQIVWsWFH16tXT6dOnJUk9evTQunXrtGzZMu3cuVM3btxQgwYNzI+PjIxUrVq19OzZM/3xxx+aM2eOZs+ercGDB1trlQAAAAAAMDOYTCaTtYuILk2aNBozZowaNmwoNzc3LVy4UA0bNpQknTt3Tvny5VNAQIA++ugjbdiwQbVr19aNGzfk4eEhSZoyZYr69eunO3fuyMHBIVbPGRwcLFdXVwUFBcnFxcVi62Zto47etXYJiAf9i6SzdgmIDwsN1q4A8aVZgvoaRRwZhrFNJgWmIWyPAN6d2ObIBHNOd2RkpBYvXqyQkBB5e3vr8OHDCg8PV+XKlc3z5M2bV1myZFFAQIAkKSAgQAULFjQHbkmqVq2agoODzb3lLxMWFqbg4OAYNwAAAAAA4pvVQ/fJkyeVIkUKOTo6qkOHDlq1apW8vLwUGBgoBwcHpUqVKsb8Hh4eCgwMlCQFBgbGCNxR06OmvcrIkSPl6upqvmXOnDl+VwoAAAAAACWA0J0nTx4dO3ZM+/fvV8eOHdWyZUudOXPGos85YMAABQUFmW/Xrl2z6PMBAAAAAN5PdtYuwMHBQbly5ZIkFStWTAcPHtT48ePVpEkTPXv2TA8fPozR233r1i15enpKkjw9PXXgwIEYy4sa3TxqnpdxdHSUo6NjPK8JAAAAAAAxWb2n+9+MRqPCwsJUrFgx2dvba+vWreZp58+f19WrV+Xt7S1J8vb21smTJ3X79m3zPJs3b5aLi4u8vLzeee0AAAAAAERn1Z7uAQMGqEaNGsqSJYsePXqkhQsXaseOHfL395erq6vatGmjnj17Kk2aNHJxcVGXLl3k7e2tjz76SJJUtWpVeXl56YsvvtDo0aMVGBioQYMGyc/Pj55sAAAAAIDVWTV03759Wy1atNDNmzfl6uqqQoUKyd/fX1WqVJEk/fTTT7KxsZGPj4/CwsJUrVo1/fLLL+bH29raav369erYsaO8vb2VPHlytWzZUt988421VgkAAAAAALMEd51ua+A63UhMuE53EsF1upMOrtOdJHCd7qSB63QDeJcS3XW6AQAAAABIagjdAAAAAABYCKEbAAAAAAALIXQDAAAAAGAhhG4AAAAAACyE0A0AAAAAgIUQugEAAAAAsBBCNwAAAAAAFkLoBgAAAADAQgjdAAA
jIiKCoUOHsmnTJjIyMtixYwdr167l7bffZuvWrRw4cIBPP/2UKlWqkJeXh5+fHxs2bGDKlCm4ubnZu3wR4dbt/czmW32HH374IR4eHkyYMIHY2FjMZjMHDhzAw8MDgMDAQE6cOMHkyZPJzMxk8ODB9ixdXlAK3SIiIvJCiImJYeTIkXz99de0bduWX375hZSUFPz8/Ni4cSN9+vRh7ty5BAUFkZuby/Lly2ndujX+/v6sX78eKPplX0SePMMwbOfgRx99REpKCnFxcVy7do2xY8fSr18/8vLybPvXq1ePevXqERAQgLOzs22aSOFRKyLFTf81RERE5LkXHR3NgAEDaNGiBe3atQNuze/09fVl0aJF9OvXj7lz5/LBBx8AkJyczLZt20hNTS1yHAVuEfspHJYTEhLYu3cvY8aMAWDw4MHMmDGD9evXEx4ezpkzZ2ztCgoKKFGiBCaTiYKCAgVueeL0n0NERESea5GRkQQGBhIYGEhSUhKjRo0CoGrVqtSoUYNp06YxePBgW+DOyclh4sSJ5OTk8NZbb9mzdBEp5HZYjouLIyoqigYNGvDmm29y48YNAEaMGMG4ceNYvnw5kZGRnDp1Cih6sUwXzsQeNLxcREREnluhoaGMGTOGTZs20bZtWyIiIggODqagoIDw8HCioqK4ePEiS5cuJTc3F0dHRw4dOsTFixc5ePAgZrNZQ8pFniI3b95kxYoVbN++nfr16wO3Rq3k5eXh6OjIqFGjMJvNfPjhh3h7e2sOtzwVdJ9uEREReW7t2bOH9PR0evXqBcCVK1dYs2YNkyZNomfPnoSHhwMwadIkkpOTyc/Pp27dukybNg2LxUJ+fj4Wi/ooROzlXvOvr169ytixY/nvf//LqFGjGD16NE5OTrbgDbd6wzt37qzVyeWpoNAtIiIiz73CX9yzsrKIjY29K3jfuHHDdh9fAKvVqi/sInZUeJTJmTNncHFxIS8vDy8vL7KyshgxYgS//PILffv2ZciQITg6OhYJ3qDzWJ4OunQrIiIiz73CPWXu7u62nu/g4GAcHBxYsGBBkcANun+viD0VXqV8ypQpbNy4kd9//x13d3fGjh1L//79CQ8PZ/jw4axatQqz2UxQUFCRwA06j+XpoNAtIiIiL5zbwdtkMjFkyBBq1qzJ6NGj7V2WiPy/2xfKPvnkE9v6C1lZWSQmJvL++++TkZHB2LFjWbhwIaNHj2b+/PlUrFiRrl272rlykbspdIuIiMgLyd3dne7du1OhQgU6dOhg73JE5A7Z2dns2LGD6dOn07lzZ9vz1atXZ+TIkdStW5d27doRFhZGWFgYnTp1sl+xIn9Cc7pFREREQIumidjZnYumXbhwwbaw4bBhwzAMA8MwsFqtdO/eHU9PT8LCwnBycrK10RxueRrp/hciIiIioMAtYkcFBQW2wJ2RkYHVaqV8+fJ06NCBdevWcfr0aUwmEyaTCUdHR9zc3Lh8+XKRwA2awy1PJ4VuERERERGxm8KrlIeEhDBlyhQOHjwIQOvWrcnNzWX+/PmkpaVhMpm4fv06Z8+epXLlyvYsW+S+aXi5iIiIiIjY3fjx44mOjiY0NJSWLVvi6ekJQHh4ODExMWRkZODv709aWhq5ubkcPnwYi8Vyz3t5izxNFLpFRERERMSu4uPjGT58OJs3b6Z+/frArTndmZmZvPLKKxw/fpwNGzZw8uRJvL29mTBhAhaLRWsxyDNBf6EiIiIiImJX169fp2bNmtSqVYvjx48TFxfHkiVLcHR0pE6dOsTGxjJu3LgibaxWqwK3PBM0p1tERERERJ6YgoKCu54zDINTp07Rv39/WrVqRXJyMmPGjCE4OJikpCSOHj16VxstmibPCl0aEhERERGRJ6LwomkpKSnk5eXh4+ND7969ycnJ4aeffqJr1660bNkSLy8vUlJScHNzU4+2PNM0p1tERERERJ6oiRMnsnr1anJzc2nUqBFLly6lYsWKtkXRrFYrOTk59OnTh6t
Xr7Jr1y5bWBd51ugvV0REREREilXhIeWrV69m1apVzJo1iwULFnDq1CkCAgI4duwYADdv3mTWrFl0796d9PR0tm/fjtlsvuewdJFngXq6RURERETkiVi/fj3nzp3DYrEQFBQEQGZmJs2aNcPZ2ZmVK1dSp04dVq5cSWJiItOnT9cq5fLMU+gWEREREZFid/HiRapVq0Zubi5Tp05l8uTJtuHkly9fplmzZjg5ObFixQp8fX1t7axWqxZNk2eahpeLiIiIiMhjd2ffnoeHB/v376d27dps3bqV8+fPYzKZMAyDMmXKsHfvXlJTU5k1a1aRdgrc8qxTT7eIiIiIiDxWhXunL126hMViwWQy4e7uTmJiIq1atcLf35+YmBg8PDxsPd7Z2dmULFlSQVueKwrdIiIiIiLy2GRnZ+Pq6gpASEgIe/bs4eTJk/ztb3+je/fudOrUiaSkJFq3bk2DBg2IiYmhXLlytuANGlIuzxeFbhEREREReSxiYmL49ddfmTJlCsHBwSxatIjIyEicnJyYN28ehw8f5ujRo1SpUoWkpCTatm2Lp6cn27Zto3Tp0vYuX6RYaE63iIiIiIg8soiICAYMGMAbb7xBRkYGO3bsYO3atXTu3BmLxcKBAweYPXs2VapUIS8vDz8/PzZs2ICXlxdubm72Ll+k2KinW0REREREHklMTAyBgYHEx8fTrl07fvnlF5o3b86hQ4f4/vvv6dOnD3PnzuWDDz4gNzeX5cuX07p1a2rUqGE7RkFBAWaz+gTl+aO/ahEREREReWjR0dEMGDCAFi1a0K5dOwCcnZ3x9fVl0aJF9OvXzxa4AZKTk9m2bRupqalFjqPALc8r/WWLiIiIiMhDiYyMJDAwkMDAQJKSkhg1ahQAVatWpUaNGkybNo3BgwfbAndOTg4TJ04kJyeHt956y56lizwxFnsXICIiIiIiz57Q0FDGjBnDpk2baNu2LREREQQHB1NQUEB4eDhRUVFcvHiRpUuXkpubi6OjI4cOHeLixYscPHgQs9msIeXyQtCcbhEREREReWB79uwhPT2dXr16AXDlyhXWrFnDpEmT6NmzJ+Hh4QBMmjSJ5ORk8vPzqVu3LtOmTcNisZCfn4/Foj5Aef4pdIuIiIiIyEMrfH/trKwsYmNj7wreN27cwNnZ2dZG9+GWF4kuLYmIiIiIyEO7HbgB3N3dbT3fwcHBODg4sGDBgiKBG1DglheKQreIiIiIiDw2t4O3yWRiyJAh1KxZk9GjR9u7LBG70fByERERERF57C5fvsyePXvo0KGDerblhabQLSIiIiIixUqLpsmLTKFbREREREREpJjopngiIiIiIiIixUShW0RERERERKSYKHSLiIiIiIiIFBOFbhEREREREZFiotAtIiIiIiIiUkwUukVERERERESKiUK3iIiIiIiISDFR6BYREREREREpJgrdIiIiz4Hdu3djMpm4fPmyvUuxee+99+jUqZO9yxAREbErhW4REZFH8N5772EymWw/5cqVIyAggKNHj9q7tGdKdHR0kc/xXj+nT5+2d5kiIiIPTKFbRETkEQUEBJCenk56ejo7duzAYrHQoUMHe5dlN1arlYKCggdq07NnT9tnmJ6eTtOmTQkKCirynLe3dzFVLCIiUnwUukVERB6Rs7Mznp6eeHp64u/vz4QJE0hNTeXChQu2fVJTU+nRowdlypThpZdeomPHjkV6bm8Pxf7000/x8vKiXLlyDB8+nLy8PNs+N27cYPz48Xh7e+Ps7IyPjw9LliwpUsuBAwdo3LgxLi4uvPnmm/z888+2bVOnTsXf358vv/ySqlWr4urqyrBhw7BarcyZMwdPT08qVKhASEhIkWN+9tlnvPrqq5QqVQpvb2+GDRtGdna2bXt0dDRlypRhw4YN1K1bF2dnZ86cOXPX57R//37Kly/P7Nmz79pWsmRJ22fo6emJk5MTLi4ueHp68s033+Dn50d+fn6RNp06daJfv35F3ltERAT
e3t64uLjQo0cPrly5UqRNVFQUvr6+lChRgjp16vD555/fVYuIiMjjpNAtIiLyGGVnZ7NixQp8fHwoV64cAHl5ebRp0wY3Nzf27t3L//73P1xdXQkICODmzZu2trt27eLkyZPs2rWLZcuWER0dTXR0tG17//79Wb16NWFhYRw7doyIiAhcXV2LvP6kSZOYN28eP/zwAxaLhYEDBxbZfvLkSTZv3syWLVtYvXo1S5YsoX379qSlpbFnzx5mz55NcHAw3333na2N2WwmLCyMpKQkli1bxs6dOxk3blyR4167do3Zs2cTFRVFUlISFSpUKLJ9586dtGrVipCQEMaPH/9An2n37t2xWq1s2LDB9lxGRgabNm0q8v5OnDjB2rVr2bhxI1u2bOHQoUMMGzbMtn3lypVMnjyZkJAQjh07xowZM/j4449ZtmzZA9UjIiLyQAwRERF5aAMGDDAcHByMUqVKGaVKlTIAw8vLyzhw4IBtn5iYGKN27dpGQUGB7bkbN24YJUuWNLZu3Wo7TrVq1Yz8/HzbPt27dzd69uxpGIZh/PzzzwZgbNu27Z517Nq1ywCM7du3257btGmTARi5ubmGYRjGlClTDBcXFyMrK8u2T5s2bYzq1asbVqvV9lzt2rWNmTNn/uF7/uqrr4xy5crZHi9dutQAjMOHD9/12XTs2NFYt26d4erqasTGxv7hMe/UvHlzY/To0bbHQ4cONdq2bWt7PG/ePKNmzZq2z3TKlCmGg4ODkZaWZttn8+bNhtlsNtLT0w3DMIxatWoZq1atKvI606dPN5o2bXrfdYmIiDwoi10Tv4iIyHOgZcuWfPHFFwBkZmby+eef07ZtW77//nuqVavGkSNHOHHiBG5ubkXaXb9+nZMnT9oe+/n54eDgYHvs5eXFjz/+CMDhw4dxcHCgefPmf1pL/fr1i7SHW73CVatWBaB69epF6qhYsSIODg6YzeYiz2VkZNgeb9++nZkzZ3L8+HGysrLIz8/n+vXrXLt2DRcXFwCcnJyKvPZt3333HV9//TVxcXGPtJJ5UFAQr7/+OmfPnqVy5cpER0fbFrG7rWrVqlSuXNn2uGnTphQUFPDzzz/j5ubGyZMnCQwMJCgoyLZPfn4+pUuXfui6RERE/opCt4iIyCMqVaoUPj4+tsdRUVGULl2ayMhIPvnkE7Kzs3nttddYuXLlXW3Lly9v+93R0bHINpPJZFuQrGTJkvdVS+Fj3A6khRc1u9dr/Nnrnj59mg4dOjB06FBCQkJ46aWX+PbbbwkMDOTmzZu20F2yZMkiAfi2WrVqUa5cOb788kvat29/12vdr4YNG9KgQQOWL19O69atSUpKYtOmTffd/vYc9MjISJo0aVJkW+ELHSIiIo+bQreIiMhjZjKZMJvN5ObmAtCoUSPWrFlDhQoVcHd3f6hjvvrqqxQUFLBnzx7eeeedx1nunzpw4AAFBQXMmzfP1hu+du3a+27v4eHBunXraNGiBT169GDt2rUPHbwHDRpEaGgoZ8+e5Z133rlrNfMzZ85w7tw5KlWqBMC+ffswm83Url2bihUrUqlSJX799Vf69u37UK8vIiLyMLSQmoiIyCO6ceMG58+f5/z58xw7doyRI0eSnZ3Nu+++C0Dfvn3x8PCgY8eO7N27l1OnTrF7925GjRpFWlrafb1G9erVGTBgAAMHDiQ+Pt52jAcJwA/Dx8eHvLw8Fi5cyK+//kpMTAyLFi16oGNUqFCBnTt3cvz4cXr37n3XKuT3q0+fPqSlpREZGXnXAnEAJUqUYMCAARw5coS9e/cyatQoevTogaenJwDTpk1j5syZhIWFkZyczI8//sjSpUv57LPPHqoeERGR+6HQLSIi8oi2bNmCl5cXXl5eNGnShP379/PVV1/RokULAFxcXEhISKBq1ap06dIFX19fAgMDuX79+gP1fH/xxRd069aNYcOGUadOHYKCgsjJySmmd3VLgwYN+Oyzz5g9ezb16tVj5cqVzJw584G
P4+npyc6dO/nxxx/p27cvVqv1gY9RunRpunbtiqur6z3nh/v4+NClSxfatWtH69atqV+/fpFbgg0aNIioqCiWLl3Kq6++SvPmzYmOjqZGjRoPXIuIiMj9MhmGYdi7CBEREZH78Y9//AM/Pz/CwsKKPD916lTi4+M5fPiwfQoTERH5A5rTLSIiIk+9zMxMdu/eze7du4v0XouIiDztFLpFRETkqdewYUMyMzOZPXs2tWvXtnc5IiIi903Dy0VERERERESKiRZSExERERERESkmCt0iIiIiIiIixUShW0RERERERKSYKHSLiIiIiIiIFBOFbhEREREREZFiotAtIiIiIiIiUkwUukVERERERESKiUK3iIiIiIiISDFR6BYREREREREpJv8Ht6JQFr5TJ58AAAAASUVORK5CYII=", + "text/plain": [ + "
"
+       ]
+      },
+      "metadata": {},
+      "output_type": "display_data"
+     }
+    ],
     "source": [
+     "total_time.index = [\"pytorch_model\", \"TensorRT\", \"TensorRT_GPU_Transform\", \"TensorRT_GPU_Transform_GDS\"]\n",
+     "\n",
      "plt.figure(figsize=(10, 6))\n",
-     "average_time.plot(kind='bar', color=['skyblue', 'orange', 'green', 'red'])\n",
-     "plt.title('Average Inference Time for Each Benchmark Type')\n",
+     "total_time.plot(kind='bar', color=['skyblue', 'orange', 'green', 'red'])\n",
+     "plt.title('Total Inference Time for Each Benchmark Type')\n",
      "plt.xlabel('Benchmark Type')\n",
-     "plt.ylabel('Average Time (seconds)')\n",
+     "plt.ylabel('Total Time (seconds)')\n",
      "plt.xticks(rotation=45)\n",
      "plt.tight_layout()\n",
      "plt.show()"
     ]
    },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+     "### Compare the Original model and the most optimized model\n",
+     "\n",
+     "If we plot all the scatter points comparing the original model to the most optimized model, it becomes evident that larger files benefit significantly more from our optimizations.\n",
+     "\n",
+     "As the file size increases, the inference time of the original model grows significantly, while the inference time of the most optimized model shows no obvious increase. This indicates that our approach is particularly effective for handling larger datasets."
+ ] + }, { "cell_type": "code", - "execution_count": null, + "execution_count": 25, "metadata": {}, - "outputs": [], - "source": [] + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA1cAAAIjCAYAAADvBuGTAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvc2/+5QAAAAlwSFlzAAAPYQAAD2EBqD+naQAAjX9JREFUeJzs3XlclFX7x/HvgLKogKIioIi4pCLmmoZLZm644FI9lbmmbaaZaZtPi5KWZZlabqWlle09ZWpJpplLbplpGuYWaiqIiYKouDD37w9+TI6sgzPMAJ/368Wr5r7P3HPNcRjmmnPOdUyGYRgCAAAAAFwXN2cHAAAAAAAlAckVAAAAANgByRUAAAAA2AHJFQAAAADYAckVAAAAANgByRUAAAAA2AHJFQAAAADYAckVAAAAANgByRUAAAAA2AHJFYBSxWQyaeLEic4O47p9+OGHatCggcqWLauKFSs69LEmTpwok8lUqPsuWrRIJpNJhw4dsm9QVzl06JBMJpMWLVrksMew1a233qpbb73V2WGUGLVq1dLQoUPtek1nvBdcz+9SUbqe39vi8hwBRyG5AkqZgwcP6qGHHlLt2rXl5eUlX19ftW3bVjNnztSFCxecHR4K4M8//9TQoUNVp04dzZ8/X++8846zQ0IJMGfOHKcmqBs3btTEiRN15swZp8UAANerjLMDAFB0vv32W/3nP/+Rp6enBg8erIiICF26dEkbNmzQk08+qT/++KPEf1C/cOGCypQp3m99P/30k8xms2bOnKm6des6/PGee+45PfPMM4W676BBg3TPPffI09PTzlHB3ubMmaMqVarYfYSooDZu3KiYmBgNHTo022js3r175eZm3++DS8J7AQDXw7sKUErEx8frnnvuUWhoqH788UcFBQVZzo0cOVIHDhzQt99+68QIHcdsNuvSpUvy8vKSl5eXs8O5bklJSZLk8OmA586dU/ny5VWmTJlCfwh1d3eXu7u7nSNDaeOI5LwkvBcAcD1MCwRKialTpyotLU3vvvuuVWKVpW7dunrssccst69cuaJJkyapTp068vT0VK1atfTf//5XFy9etLpfrVq11KtXL/30009q2bKlvL291bhxY/3000+SpK+++kqNGzeWl5eXWrRood9++83q/kOHDlWFChX0119/qVu3bipfvryCg4P14osvyjAMq7avv/662rRpo8qVK8vb21stWrTQl19+me25mEwmjRo1Sh999JEaNWokT09PxcbGWs5dvc7i7NmzGjNmjGrVqiVPT08FBASoS5cu2r59u9U1v/jiC7Vo0ULe3t6qUqWKBg4cqGPHjuX4XI4dO6a+ffuqQoUKqlq1qp544gllZGTk8i9jbc6cOZaYg4ODNXLkSKtpUrVq1dKECRMkSVWrVi3QupEff/xR7du3V/ny5VWxYkX16dNHe/bssWqTtU4iLi5O9957rypVqqR27dpZnbvahQsXNHr0aFWpUkU+Pj7q3bu3jh07li2enNZuZL1mNmzYoFatWsnLy0u1a9fWBx98YPUYycnJeuKJJ9S4cWNVqFBBvr6+6t69u3bu3FmgvrxWQa/3008/yWQy6fPPP9dLL72kGjVqyMvLS506ddKBAweyXfedd95RnTp15O3trVatWmn9+vUFjinrtfrFF18oPDxc3t7eioyM1K5duyRJb7/9turWrSsvLy/deuutOa6BKchrMzExUffdd59q1KghT09PBQUFqU+fPpbr1apVS3/88YfWrl0rk8kkk8mU75qxc+fOady4cQoJCZGnp6fq16+v119/Pdvv7dW/j/Xr17e8F6xbt87SZ
uLEiXryySclSWFhYZYYro7v6hG1rNfVhg0bNHr0aFWtWlUVK1bUQw89pEuXLunMmTMaPHiwKlWqpEqVKumpp57KMa6s12rWur3cfq62ZcsWRUVFyc/PT+XKlVOHDh30888/Z+ufDRs26KabbpKXl5fq1Kmjt99+O8/+vNqtt96qiIgI/f777+rQoYPKlSununXrWt7v1q5dq9atW8vb21v169fXqlWrsl3jt99+U/fu3eXr66sKFSqoU6dO2rx5c7Z2f/zxh2677TZ5e3urRo0amjx5ssxmc45xrVixwvJe4uPjo549e+qPP/4o8PMCSgUDQKlQvXp1o3bt2gVuP2TIEEOSceeddxqzZ882Bg8ebEgy+vbta9UuNDTUqF+/vhEUFGRMnDjRmD59ulG9enWjQoUKxuLFi42aNWsar7zyivHKK68Yfn5+Rt26dY2MjAyrx/Hy8jLq1atnDBo0yJg1a5bRq1cvQ5Lx/PPPWz1WjRo1jEceecSYNWuW8cYbbxitWrUyJBnLly+3aifJaNiwoVG1alUjJibGmD17tvHbb79Zzk2YMMHS9t577zU8PDyMsWPHGgsWLDBeffVVIzo62li8eLGlzcKFCw1Jxk033WRMnz7deOaZZwxvb2+jVq1axunTp7M9l0aNGhnDhg0z5s6da9xxxx2GJGPOnDn59vmECRMMSUbnzp2Nt956yxg1apTh7u5u3HTTTcalS5cMwzCMr7/+2ujXr58hyZg7d67x4YcfGjt37sz1mj/88INRpkwZ44YbbjCmTp1qxMTEGFWqVDEqVapkxMfHZ3vs8PBwo0+fPsacOXOM2bNnW5272l133WVIMgYNGmTMnj3buOuuu4wmTZpk69+svrv6sbJeM9WqVTP++9//GrNmzTKaN29umEwmY/fu3ZZ2v/zyi1GnTh3jmWeeMd5++23jxRdfNKpXr274+fkZx44ds7SLj483JBkLFy7Ms38Ler01a9YYkoxmzZoZLVq0MKZPn25MnDjRKFeunNGqVSuray5YsMCQZLRp08Z48803jTFjxhgVK1Y0ateubXTo0CHPeAwj8/V44403GiEhIVa/JzVr1jRmzZplhIeHG9OmTTOee+45w8PDw+jYsaPV/Qv62mzTpo3h5+dnPPfcc8aCBQuMl19+2ejYsaOxdu1awzAyX1c1atQwGjRoYHz44YfGhx9+aKxcuTLXuM1ms3HbbbcZJpPJuP/++41Zs2YZ0dHRhiRjzJgx2Z5jRESEUaVKFePFF180Xn31VSM0NNTw9vY2du3aZRiGYezcudPo37+/IcmYPn26JYa0tDTDMDJfM0OGDMn2vJs2bWpERUUZs2fPNgYNGmRIMp566imjXbt2xr333mvMmTPH8n7y/vvvZ4sr67WalpZmecysn/fee8/w8/MzqlatarnP6tWrDQ8PDyMyMtKYNm2aMX36dOPGG280PDw8jC1btlja/f7774a3t7dRs2ZNY8qUKcakSZOMatWqGTfeeGO236WcdOjQwQgODjZCQkKMJ5980njrrbeM8PBww93d3fj000+NwMBAY+LEicaMGTMsr+HU1FTL/Xfv3m2UL1/eCAoKMiZNmmS88sorRlhYmOHp6Wls3rzZ0i4hIcGoWrWqUalSJWPixInGa6+9ZtSrV88S59W/tx988IFhMpmMqKgo46233jJeffVVo1atWkbFihVzfC8BSite/UApkJKSYkgy+vTpU6D2O3bsMCQZ999/v9XxJ554wpBk/Pjjj5ZjoaGhhiRj48aNlmPff/+9Icnw9vY2Dh8+bDn+9ttvG5KMNWvWWI5lJXGPPvqo5ZjZbDZ69uxpeHh4GCdPnrQcP3/+vFU8ly5dMiIiIozbbrvN6rgkw83Nzfjjjz+yPbdrP/z7+fkZI0eOzLUvLl26ZAQEBBgRERHGhQsXLMeXL19uSDJeeOGFbM/lxRdftLpG1of0vCQlJRkeHh5G165drZLPWbNmGZKM9957z3Is68PL1
X2Tm6ZNmxoBAQHGqVOnLMd27txpuLm5GYMHD852zf79+2e7xrUfln799dccP0QPHTq0wMmVJGPdunVWz9/T09MYN26c5Vh6erpVXxhGZiLl6elp1ccFTa4Ker2s5Kphw4bGxYsXLcdnzpxpSLIkBFmvjaZNm1q1e+eddwxJBU6uPD09rfon6/ckMDDQ6gPz+PHjrfqyoK/N06dPG5KM1157Lc9YGjVqVKCYDcMwlixZYkgyJk+ebHX8zjvvNEwmk3HgwAGr5yjJ2LZtm+XY4cOHDS8vL6Nfv36WY6+99lq210qW3JKrbt26GWaz2XI8MjLSMJlMxsMPP2w5duXKFaNGjRrZntu1r9VrPfLII4a7u7vl/c5sNhv16tXL9pjnz583wsLCjC5duliO9e3b1/Dy8rJ6/4uLizPc3d0LnFxJMj7++GPLsT///NPy3nZ1gpT1fnv1679v376Gh4eHcfDgQcux48ePGz4+PsYtt9xiOTZmzBhDklVimJSUZPj5+Vn9W5w9e9aoWLGi8cADD1jFmZiYaPj5+VkdJ7lCace0QKAUSE1NlST5+PgUqP13330nSRo7dqzV8XHjxklStrVZ4eHhioyMtNxu3bq1JOm2225TzZo1sx3/66+/sj3mqFGjLP+fNY3o0qVLVtNdvL29Lf9/+vRppaSkqH379tmm8ElShw4dFB4ens8zzVy3tGXLFh0/fjzH89u2bVNSUpIeeeQRqzUaPXv2VIMGDXJcp/bwww9b3W7fvn2Oz/lqq1at0qVLlzRmzBirhfsPPPCAfH19C7UeLiEhQTt27NDQoUPl7+9vOX7jjTeqS5culn/nvGLPSdYUy0ceecTq+KOPPlrg2MLDw9W+fXvL7apVq6p+/fpW/eTp6Wnpi4yMDJ06dUoVKlRQ/fr1c/w3z4+t17vvvvvk4eFhuZ0Vb1aMWa+Nhx9+2Krd0KFD5efnV+C4OnXqpFq1alluZ/2e3HHHHVa/s9f+/hT0tent7S0PDw/99NNPOn36dIHjyst3330nd3d3jR492ur4uHHjZBiGVqxYYXU8MjJSLVq0sNyuWbOm+vTpo++//77AU2ZzMnz4cKtpe61bt5ZhGBo+fLjlmLu7u1q2bJnv7+DVPvjgA82ZM0dTp05Vx44dJUk7duzQ/v37de+99+rUqVP6559/9M8//+jcuXPq1KmT1q1bJ7PZrIyMDH3//ffq27ev1ftfw4YN1a1btwLHUKFCBd1zzz2W2/Xr11fFihXVsGFDy2sh6zlL/74uMjIytHLlSvXt21e1a9e2tAsKCtK9996rDRs2WP4mfPfdd7r55pvVqlUrS7uqVatqwIABVrH88MMPOnPmjPr372953v/884/c3d3VunVrrVmzpsDPCyjpSK6AUsDX11dS5vqigjh8+LDc3NyyVaILDAxUxYoVdfjwYavjV3+AkGT5YBkSEpLj8Ws/4Lm5uVl9CJCkG264QZKs1pgsX75cN998s7y8vOTv76+qVatq7ty5SklJyfYcwsLC8nuakjLXou3evVshISFq1aqVJk6caPUhLOu51q9fP9t9GzRokK0vvLy8VLVqVatjlSpVyvdDbW6P4+Hhodq1a2d7nILIK/aGDRtaPhherSD9lvX6uLatLZULr33NSNn7yWw2a/r06apXr548PT1VpUoVVa1aVb///nuO/+b5sfV618ZYqVIlSf++frP6t169elbtypYtm+31nJfC/v4U9LXp6empV199VStWrFC1atV0yy23aOrUqUpMTCxwjNc6fPiwgoODs31h07BhQ6vYslzbR1Lm7/j58+d18uTJQsdhS98VNLHcsWOHHn74YfXv39/qC6b9+/dLkoYMGaKqVata/SxYsEAXL15USkqKTp48qQsXLuT4nHP6t8pNjRo1sq338vPzy/d1cfLkSZ0/fz7X33uz2ay///5bUua/U0HizHrut912W7bnvnLlSkuRHQBUCwRKBV9fXwUHB2v37t023a+gG0HmV
g0ut+PGNQvLC2L9+vXq3bu3brnlFs2ZM0dBQUEqW7asFi5cqI8//jhb+6tHufJy1113qX379vr666+1cuVKvfbaa3r11Vf11VdfqXv37jbHWdwr4xW0365XQV4bL7/8sp5//nkNGzZMkyZNkr+/v9zc3DRmzJhcF9znxdbr2fP1m5ei+P0ZM2aMoqOjtWTJEn3//fd6/vnnNWXKFP34449q1qyZzddzFbb0XUH67fTp07rjjjt0ww03aMGCBVbnsl4jr732mpo2bZrj/StUqJCt6E9hFcXroqCynvuHH36owMDAbOcpaQ/8i98GoJTo1auX3nnnHW3atMlqCl9OQkNDZTabtX//fss30ZJ04sQJnTlzRqGhoXaNzWw266+//rKMVknSvn37JMkyXep///ufvLy89P3331uVZV64cOF1P35QUJAeeeQRPfLII0pKSlLz5s310ksvqXv37pbnunfvXt12221W99u7d6/d+uLqx7l61OPSpUuKj49X586dr+ua1/rzzz9VpUoVlS9fvlDXNZvNio+Pt/rWO6dKetfjyy+/VMeOHfXuu+9aHT9z5oyqVKni9Otl9e/+/futXhuXL19WfHy8mjRpYvM1C/P4BX1t1qlTR+PGjdO4ceO0f/9+NW3aVNOmTdPixYslFfzLlKzHXrVqlc6ePWs1evXnn39axZYla+Tjavv27VO5cuUsI722PL4jmM1mDRgwQGfOnNGqVatUrlw5q/N16tSRlPllVV6/j1WrVpW3t3eOzzmn30V7q1q1qsqVK5fr772bm5tl9Cs0NLRAcWY994CAgEK9FwGlCdMCgVLiqaeeUvny5XX//ffrxIkT2c4fPHhQM2fOlCT16NFDkjRjxgyrNm+88YakzDUd9jZr1izL/xuGoVmzZqls2bLq1KmTpMxva00mk9X6jEOHDmnJkiWFfsyMjIxs08ECAgIUHBxs+fa5ZcuWCggI0Lx586y+kV6xYoX27Nljt77o3LmzPDw89Oabb1p9A/3uu+8qJSWlUI8TFBSkpk2b6v3337cq5757926tXLnS8u9sq6x1I3PmzLE6/tZbbxXqerlxd3fP9m38F198ka3MuLOu17JlS1WtWlXz5s3TpUuXLMcXLVpk1d+OUtDX5vnz55Wenm513zp16sjHx8fqfuXLly9w3D169FBGRobV760kTZ8+XSaTKduo76ZNm6zWtf3999/65ptv1LVrV8tITFaiXxR9l5OYmBh9//33+uSTT3KcHtuiRQvVqVNHr7/+utLS0rKdz5re6O7urm7dumnJkiU6cuSI5fyePXv0/fffO+4J/D93d3d17dpV33zzjdW06hMnTujjjz9Wu3btLFPFe/Tooc2bN2vr1q1Wz+Ojjz6yuma3bt3k6+url19+WZcvX872mNcztRMoaRi5AkqJOnXq6OOPP9bdd9+thg0bavDgwYqIiNClS5e0ceNGffHFF5Z9ZJo0aaIhQ4bonXfe0ZkzZ9ShQwdt3bpV77//vvr27WtZ4G0vXl5eio2N1ZAhQ9S6dWutWLFC3377rf773/9avtXu2bOn3njjDUVFRenee+9VUlKSZs+erbp16+r3338v1OOePXtWNWrU0J133qkmTZqoQoUKWrVqlX755RdNmzZNUub6mVdffVX33XefOnTooP79++vEiROaOXOmatWqpccff9wufVC1alWNHz9eMTExioqKUu/evbV3717NmTNHN910kwYOHFio67722mvq3r27IiMjNXz4cF24cEFvvfWW/Pz88t0fKzctWrTQHXfcoRkzZujUqVO6+eabtXbtWstoo71GIHr16qUXX3xR9913n9q0aaNdu3bpo48+smk9kyOvV7ZsWU2ePFkPPfSQbrvtNt19992Kj4/XwoULC31NWx+/IK/Nffv2qVOnTrrrrrsUHh6uMmXK6Ouvv9aJEyesCia0aNFCc+fO1eTJk1W3bl0FBARkGxHLEh0drY4dO+rZZ5/VoUOH1KRJE
61cuVLffPONxowZYxnpyBIREaFu3bpp9OjR8vT0tCTmMTExVo8vSc8++6zuuecelS1bVtHR0YUaXbXVrl27NGnSJN1yyy1KSkqyjOZlGThwoNzc3LRgwQJ1795djRo10n333afq1avr2LFjWrNmjXx9fbVs2TLL84qNjVX79u31yCOP6MqVK3rrrbfUqFGjQr9f2WLy5Mn64Ycf1K5dOz3yyCMqU6aM3n77bV28eFFTp061tHvqqaf04YcfKioqSo899pjKly+vd955R6GhoVZx+vr6au7cuRo0aJCaN2+ue+65R1WrVtWRI0f07bffqm3bttkSbaDUck6RQgDOsm/fPuOBBx4watWqZXh4eBg+Pj5G27ZtjbfeestIT0+3tLt8+bIRExNjhIWFGWXLljVCQkKM8ePHW7UxjMwSyT179sz2OJKylTjPKpl9dUnoIUOGGOXLlzcOHjxodO3a1ShXrpxRrVo1Y8KECdnKZr/77rtGvXr1DE9PT6NBgwbGwoULcyz7m9NjX30uq/zyxYsXjSeffNJo0qSJ4ePjY5QvX95o0qRJjntSffbZZ0azZs0MT09Pw9/f3xgwYIBx9OhRqzZZz+VatpQmnjVrltGgQQOjbNmyRrVq1YwRI0ZY7Vd09fUKUordMAxj1apVRtu2bQ1vb2/D19fXiI6ONuLi4gp8zZziP3funDFy5EjD39/fqFChgtG3b19j7969hiTjlVdesbTLrRR7Tq+ZDh06WJXLTk9PN8aNG2cEBQUZ3t7eRtu2bY1NmzZla2dLKfaCXC+rFPsXX3xhdf/cHmfOnDmWPYRatmxprFu3Lts1c1PQ35O84srvtfnPP/8YI0eONBo0aGCUL1/e8PPzM1q3bm18/vnnVtdJTEw0evbsafj4+BSolPzZs2eNxx9/3AgODjbKli1r1KtXz3jttdesypRf/RwXL15s+f1t1qyZ1ZYMWSZNmmRUr17dcHNzs3rd5FaK/ZdffrG6f26v45x+N69+L8jq29x+rvbbb78Zt99+u1G5cmXD09PTCA0NNe666y5j9erVVu3Wrl1rtGjRwvDw8DBq165tzJs3r8DvBR06dDAaNWqU7bgt77fbt283unXrZlSoUMEoV66c0bFjR6stM7L8/vvvRocOHQwvLy+jevXqxqRJk4x33303x7L4a9asMbp162b4+fkZXl5eRp06dYyhQ4daldmnFDtKO5NhOHAFJADkY+jQofryyy9znGaD4mXHjh1q1qyZFi9enK2UM0ovk8mkkSNHMrIBoFRgzRUAwGYXLlzIdmzGjBlyc3PTLbfc4oSIAABwPtZcAQBsNnXqVP3666/q2LGjypQpoxUrVmjFihV68MEHs+3DAwBAaUFyBQCwWZs2bfTDDz9o0qRJSktLU82aNTVx4kQ9++yzzg4NAACnYc0VAAAAANgBa64AAAAAwA5IrgAAAADADlhzlQOz2azjx4/Lx8fHbpthAgAAACh+DMPQ2bNnFRwcLDe3vMemSK5ycPz4capdAQAAALD4+++/VaNGjTzbkFzlwMfHR1JmB/r6+jo5GgAAAADOkpqaqpCQEEuOkBeSqxxkTQX09fUluQIAAABQoOVCFLQAAAAAADsguQIAAAAAOyC5AgAAAAA7YM1VIRmGoStXrigjI8PZoQCSpLJly8rd3d3ZYQAAAJRaJFeFcOnSJSUkJOj8+fPODgWwMJlMqlGjhipUqODsUAAAAEolkisbmc1mxcfHy93dXcHBwfLw8GCjYTidYRg6efKkjh49qnr16jGCBQAA4AQkVza6dOmSzGazQkJCVK5cOWeHA1hUrVpVhw4d0uXLl0muAAAAnICCFoXk5kbXwbUwggoAAOBcZAgAAAAAYAckVwAAAABgByRXKJBDhw7JZDJpx44dBb7PokWLVLFiRafH4QiFeW4mk0lLlixxSDwAAABwPpKrUuTvv//WsGHDLFUOQ0ND9dhjj+nUqVP53jckJEQJCQmKiIgo8OPdf
ffd2rdv3/WEXCi33nqrTCaTXnnllWznevbsKZPJpIkTJxZ5XAAAACjZSK6cJMNsaNPBU/pmxzFtOnhKGWbDoY/3119/qWXLltq/f78++eQTHThwQPPmzdPq1asVGRmp5OTkXO976dIlubu7KzAwUGXKFLzApLe3twICAuwRvs1CQkK0aNEiq2PHjh3T6tWrFRQU5JSYAAAAULKRXDlB7O4EtXv1R/Wfv1mPfbpD/edvVrtXf1Ts7gSHPebIkSPl4eGhlStXqkOHDqpZs6a6d++uVatW6dixY3r22WctbWvVqqVJkyZp8ODB8vX11YMPPpjjdLylS5eqXr168vLyUseOHfX+++/LZDLpzJkzkrJPnZs4caKaNm2qDz/8ULVq1ZKfn5/uuecenT179t++iY1Vu3btVLFiRVWuXFm9evXSwYMHbX6+vXr10j///KOff/7Zcuz9999X165dsyV8p0+f1uDBg1WpUiWVK1dO3bt31/79+63aLFq0SDVr1lS5cuXUr1+/HEf7vvnmGzVv3lxeXl6qXbu2YmJidOXKFZtjBwAArq+ovyhH8UByVcRidydoxOLtSkhJtzqemJKuEYu3OyTBSk5O1vfff69HHnlE3t7eVucCAwM1YMAAffbZZzKMf98UXn/9dTVp0kS//fabnn/++WzXjI+P15133qm+fftq586deuihh6wStNwcPHhQS5Ys0fLly7V8+XKtXbvWavreuXPnNHbsWG3btk2rV6+Wm5ub+vXrJ7PZbNNz9vDw0IABA7Rw4ULLsUWLFmnYsGHZ2g4dOlTbtm3T0qVLtWnTJhmGoR49eujy5cuSpC1btmj48OEaNWqUduzYoY4dO2ry5MlW11i/fr0GDx6sxx57THFxcXr77be1aNEivfTSSzbFDQAAXJ8zvihH8UByVYQyzIZilsUpp+81so7FLIuz+zcf+/fvl2EYatiwYY7nGzZsqNOnT+vkyZOWY7fddpvGjRunOnXqqE6dOtnu8/bbb6t+/fp67bXXVL9+fd1zzz0aOnRovrGYzWYtWrRIERERat++vQYNGqTVq1dbzt9xxx26/fbbVbduXTVt2lTvvfeedu3apbi4OJuf97Bhw/T555/r3LlzWrdunVJSUtSrVy+rNvv379fSpUu1YMECtW/fXk2aNNFHH32kY8eOWYpPzJw5U1FRUXrqqad0ww03aPTo0erWrZvVdWJiYvTMM89oyJAhql27trp06aJJkybp7bfftjluAADgupzxRTmKD5KrIrQ1PjnbL+LVDEkJKenaGp/7+qfrcfXIVH5atmyZ5/m9e/fqpptusjrWqlWrfK9bq1Yt+fj4WG4HBQUpKSnJcnv//v3q37+/ateuLV9fX9WqVUuSdOTIkQLHnqVJkyaqV6+evvzyS7333nsaNGhQtjVje/bsUZkyZdS6dWvLscqVK6t+/fras2ePpc3V5yUpMjLS6vbOnTv14osvqkKFCpafBx54QAkJCTp//rzNsQMAANfjrC/KUXwUvDoBrlvS2dwTq8K0K6i6devKZDJpz5496tevX7bze/bsUaVKlVS1alXLsfLly9s1hixly5a1um0ymaym/EVHRys0NFTz589XcHCwzGazIiIidOnSpUI93rBhwzR79mzFxcVp69at1xV7XtLS0hQTE6Pbb7892zkvLy+HPS4AACg6tnxRHlmnctEFBpfByFURCvAp2IfsgrYrqMqVK6tLly6aM2eOLly4YHUuMTFRH330ke6++26ZTKYCX7N+/fratm2b1bFffvnluuI8deqU9u7dq+eee06dOnWyTFe8Hvfee6927dqliIgIhYeHZzvfsGFDXblyRVu2bMkWR1b7hg0bWp2XpM2bN1vdbt68ufbu3au6detm+3Fz49cMAICSwFlflKP44FNfEWoV5q8gPy/llsKYJAX5ealVmL/dH3vWrFm6ePGiunXrpnXr1unvv/9WbGysunTpourVq9tceOGhhx7Sn3/+qaeff
lr79u3T559/bil9bkuSdrVKlSqpcuXKeuedd3TgwAH9+OOPGjt2bKGudfU1ExISrNZ1Xa1evXrq06ePHnjgAW3YsEE7d+7UwIEDVb16dfXp00eSNHr0aMXGxur111/X/v37NWvWLMXGxlpd54UXXtAHH3ygmJgY/fHHH9qzZ48+/fRTPffcc9cVPwAAcB3O+qIcxQfJVRFydzNpQnTmaMi16UfW7QnR4XJ3K1xykpd69epp27Ztql27tu666y7VqVNHDz74oDp27KhNmzbJ39+2hC4sLExffvmlvvrqK914442aO3eupVqgp6dnoWJ0c3PTp59+ql9//VURERF6/PHH9dprrxXqWlerWLFintMcFy5cqBYtWqhXr16KjIyUYRj67rvvLFMYb775Zs2fP18zZ85UkyZNtHLlymxJU7du3bR8+XKtXLlSN910k26++WZNnz5doaGh1x0/AABwDc78ohzFg8mwpcpBKZGamio/Pz+lpKTI19fX6lx6erri4+MVFhZW6LU0sbsTFLMszmrObpCflyZEhysqovhucPvSSy9p3rx5+vvvv50dSqlkj9cmAADIW1a1QElWhS2yEq65A5sX689zyC6v3OBaFLRwgqiIIHUJD9TW+GQlnU1XgE/mNxyOGLFypDlz5uimm25S5cqV9fPPP+u1117TqFGjnB0WAACAw0RFBGnuwObZvigPLAFflOP6kVw5ibubqdhXkdm/f78mT56s5ORk1axZU+PGjdP48eOdHRYAAIBDlZQvymF/JFcotOnTp2v69OnODgMAAKDIlYQvymF/FLQAAAAAADsguQIAAAAAOyC5AgAAAAA7ILkCAAAAADugoAUAAAAAl5FhNoptJUaSKwAAAAAuIXZ3QrY9xIKK0R5iTAuE3UycOFFNmza9rmscOnRIJpNJO3bssEtMOVm0aJEqVqzosOsX1NChQ9W3b98Ct//pp59kMpl05swZh8UEAADgLLG7EzRi8XarxEqSElPSNWLxdsXuTnBSZAVHclWK/P333xo2bJiCg4Pl4eGh0NBQPfbYYzp16pTN1zKZTFqyZInVsSeeeEKrV6++rhhDQkKUkJCgiIiI67rO9TKZTDKZTNq8ebPV8YsXL6py5coymUz66aefnBMcAABACZNhNhSzLE5GDueyjsUsi1OGOacWroPkylnMGVL8emnXl5n/NWc49OH++usvtWzZUvv379cnn3yiAwcOaN68eVq9erUiIyOVnJx83Y9RoUIFVa58fZvpubu7KzAwUGXKOH/GakhIiBYuXGh17Ouvv1aFChWcFBEAAEDJtDU+OduI1dUMSQkp6doaf/2fWR2J5MoZ4pZKMyKk93tJ/xue+d8ZEZnHHWTkyJHy8PDQypUr1aFDB9WsWVPdu3fXqlWrdOzYMT377LOWtrVq1dKkSZPUv39/lS9fXtWrV9fs2bOtzktSv379ZDKZLLevnRaYNe3t5ZdfVrVq1VSxYkW9+OKLunLlip588kn5+/urRo0aVgnMtdMChw4dahlFuvona9To4sWLeuKJJ1S9enWVL19erVu3zjaitGjRItWsWVPlypVTv379CjxSN2TIEH366ae6cOGC5dh7772nIUOGZGu7a9cu3XbbbfL29lblypX14IMPKi0tzXI+IyNDY8eOVcWKFVW5cmU99dRTMgzrb17MZrOmTJmisLAweXt7q0mTJvryyy8LFCsAAEBxlnQ298SqMO2cheSqqMUtlT4fLKUetz6empB53AEJVnJysr7//ns98sgj8vb2tjoXGBioAQMG6LPPPrP6sP/aa6+pSZMm+u233/TMM8/oscce0w8//CBJ+uWXXyRJCxcuVEJCguV2Tn788UcdP35c69at0xtvvKEJEyaoV69eqlSpkrZs2aKHH35YDz30kI4ePZrj/WfOnKmEhATLz2OPPaaAgAA1aNBAkjRq1Cht2rRJn376qX7//Xf95z//UVRUlPbv3y9J2rJli4YPH65Ro0Zpx
44d6tixoyZPnlygfmvRooVq1aql//3vf5KkI0eOaN26dRo0aJBVu3Pnzqlbt26qVKmSfvnlF33xxRdatWqVRo0aZWkzbdo0LVq0SO+99542bNig5ORkff3111bXmTJlij744APNmzdPf/zxhx5//HENHDhQa9euLVC8AAAAxVWAj5dd2zkLyVVRMmdIsU9Lec0mjX3G7lME9+/fL8Mw1LBhwxzPN2zYUKdPn9bJkyctx9q2batnnnlGN9xwgx599FHdeeedmj59uiSpatWqkqSKFSsqMDDQcjsn/v7+evPNN1W/fn0NGzZM9evX1/nz5/Xf//5X9erV0/jx4+Xh4aENGzbkeH8/Pz8FBgYqMDBQGzdu1Ntvv62vvvpKgYGBOnLkiBYuXKgvvvhC7du3V506dfTEE0+oXbt2ltGwmTNnKioqSk899ZRuuOEGjR49Wt26dStw3w0bNkzvvfeepMwRsB49emR7vh9//LHS09P1wQcfKCIiQrfddptmzZqlDz/8UCdOnJAkzZgxQ+PHj9ftt9+uhg0bat68efLz87Nc4+LFi3r55Zf13nvvqVu3bqpdu7aGDh2qgQMH6u233y5wvAAAAMVRqzB/Bfl5KbeC6yZlVg1sFeZflGHZjOSqKB3emH3EyoohpR7LbOcA105Dy0tkZGS223v27LH5MRs1aiQ3t39fZtWqVVPjxo0tt93d3VW5cmUlJSXleZ3ffvtNgwYN0qxZs9S2bVtJmVPxMjIydMMNN6hChQqWn7Vr1+rgwYOSpD179qh169Z5Pre8DBw4UJs2bdJff/2lRYsWadiwYdna7NmzR02aNFH58uUtx9q2bSuz2ay9e/cqJSVFCQkJVnGUKVNGLVu2tNw+cOCAzp8/ry5dulg9lw8++MDyXAAAAEoqdzeTJkSHS1K2BCvr9oTocJff78r5VQNKk7QT9m1XQHXr1pXJZNKePXvUr1+/bOf37NmjSpUq5TkCVVhly5a1um0ymXI8Zjabc71GYmKievfurfvvv1/Dhw+3HE9LS5O7u7t+/fVXubu7W93HXkUnKleurF69emn48OFKT09X9+7ddfbsWbtc+2pZ67O+/fZbVa9e3eqcp6en3R8PAADA1URFBGnuwObZ9rkKLEb7XJFcFaUK1ezbroAqV66sLl26aM6cOXr88cet1l0lJibqo48+0uDBg2Uy/ftNwLUlyDdv3mw1rbBs2bLKyHBshUNJSk9PV58+fdSgQQO98cYbVueaNWumjIwMJSUlqX379jnev2HDhtqyZYvVsWufW36GDRumHj166Omnn86WxGU9xqJFi3Tu3DnL6NXPP/8sNzc31a9fX35+fgoKCtKWLVt0yy23SJKuXLmiX3/9Vc2bN5ckhYeHy9PTU0eOHFGHDh1sig8AAKCkiIoIUpfwQG2NT1bS2XQF+GROBXT1EassJFdFKbSN5BucWbwix3VXpszzoW3s/tCzZs1SmzZt1K1bN02ePFlhYWH6448/9OSTT6p69ep66aWXrNr//PPPmjp1qvr27asffvhBX3zxhb799lvL+Vq1amn16tVq27atPD09ValSJbvHLEkPPfSQ/v77b61evdpqTZi/v79uuOEGDRgwQIMHD9a0adPUrFkznTx5UqtXr9aNN96onj17avTo0Wrbtq1ef/119enTR99//71iY2NtiiEqKkonT56Ur69vjucHDBigCRMmaMiQIZo4caJOnjypRx99VIMGDVK1apmJ8mOPPaZXXnlF9erVsySKV28G7OPjoyeeeEKPP/64zGaz2rVrp5SUFP3888/y9fXNsUIhAABASeTuZlJknevb3sdZWHNVlNzcpahX//9GLrNJo17JbGdn9erV07Zt21S7dm3dddddqlOnjh588EF17NhRmzZtkr+/9eLAcePGadu2bWrWrJkmT56sN954w6oQxLRp0/TDDz8oJCREzZo1s3u8WdauXauEhASFh4crKCjI8rNxY+a6tIULF2rw4MEaN
26c6tevr759++qXX35RzZo1JUk333yz5s+fr5kzZ6pJkyZauXKlnnvuOZtiMJlMqlKlijw8PHI8X65cOX3//fdKTk7WTTfdpDvvvFOdOnXSrFmzLG3GjRunQYMGaciQIYqMjJSPj0+2KZqTJk3S888/rylTpqhhw4aKiorSt99+q7CwMJviBQAAgHOYDFuqHJQSqamp8vPzU0pKSrbRivT0dMXHxyssLExeXoUsBRm3NLNq4NXFLXyrZyZW4b2vI3L7qFWrlsaMGaMxY8Y4OxTYwC6vTQAAAFjJKze4llNHrtatW6fo6GgFBwfLZDJpyZIlVudz2jzWZDLptddey/WaEydOzNY+a08klxHeWxqzWxqyXLrj3cz/jtnlEokVAAAAgMJx6pqrc+fOqUmTJho2bJhuv/32bOcTEhKsbq9YsULDhw/XHXfcked1GzVqpFWrVllulynjgkvL3NylsJyLMAAAAAAofpyadXTv3l3du3fP9XxgYKDV7W+++UYdO3ZU7dq187xumTJlst0XBXfo0CFnhwAAAAAUO8WmoMWJEyf07bffWu1zlJv9+/crODhYtWvX1oABA3TkyJE821+8eFGpqalWPwAAAABgi2KTXL3//vvy8fHJcfrg1Vq3bq1FixYpNjZWc+fOVXx8vNq3b5/nxq9TpkyRn5+f5SckJCTfeKgDAlfDaxIAAMC5ik1y9d5772nAgAH5VkHr3r27/vOf/+jGG29Ut27d9N133+nMmTP6/PPPc73P+PHjlZKSYvn5+++/c21btmxZSdL58+cL90QAB7l06ZIk5bjRMQAAABzPBSs9ZLd+/Xrt3btXn332mc33rVixom644QYdOHAg1zaenp7y9PQs0PXc3d1VsWJFJSUlScrc48hkKh47RqPkMpvNOnnypMqVK+eaBVwAAABKgWLxKezdd99VixYt1KRJE5vvm5aWpoMHD2rQoEF2iyerWEZWggW4Ajc3N9WsWZNkHwAAwEmcmlylpaVZjSjFx8drx44d8vf3V82aNSVlbtr1xRdfaNq0aTleo1OnTurXr59GjRolSXriiScUHR2t0NBQHT9+XBMmTJC7u7v69+9vt7hNJpOCgoIUEBCgy5cv2+26wPXw8PCQm1uxmekLAABQ4jg1udq2bZs6duxouT127FhJ0pAhQ7Ro0SJJ0qeffirDMHJNjg4ePKh//vnHcvvo0aPq37+/Tp06papVq6pdu3bavHmzqlatavf43d3dWd8CAAAAQJJkMigxlk1qaqr8/PyUkpIiX19fZ4cDAAAAwElsyQ2YQwQAAAAAdkByBQAAAAB2UCyqBQIAAAAomAyzoa3xyUo6m64AHy+1CvOXuxvVhIsCyRUAAABQQsTuTlDMsjglpKRbjgX5eWlCdLiiIoKcGFnpwLRAAAAAoASI3Z2gEYu3WyVWkpSYkq4Ri7crdneCkyIrPUiuAAAAgGIuw2woZlmccioDnnUsZlmcMswUCnckkisAAACgmNsan5xtxOpqhqSElHRtjU8uuqBKIZIrAAAAoJhLOpt7YlWYdigckisAAACgmAvw8bJrOxQOyRUAAABQzLUK81eQn5dyK7huUmbVwFZh/kUZVqlDcgUAAAAUc+5uJk2IDpekbAlW1u0J0eHsd+VgJFcAAABACRAVEaS5A5sr0M966l+gn5fmDmzOPldFgE2EAQAAgBIiKiJIXcIDtTU+WUln0xXgkzkVkBGrokFyBQAAAJQg7m4mRdap7OwwSiWmBQIAAACAHZBcAQAAAIAdkFwBAAAAgB2QXAEAAACAHZBcAQAAAIAdkFwBAAAAgB2QXAEAAACAHZBcAQAAAIAdkFwBAAAAgB2QXAEAAACAHZRxdgAAAAAAilaG2dDW+GQlnU1XgI+XWoX5y93N5Oywij2SKwAAAKAUid2doJhlcUpISbccC/Lz0oTocEVFBDkxsuKPaYEAAAAochlmQ5sOntI3O45p08FTyjAbzg6pVIjdnaARi7dbJ
VaSlJiSrhGLtyt2d4KTIisZGLkCAABAkWLkxDkyzIZilsUppzTWkGSSFLMsTl3CA5kiWEiMXAEAAKDIMHLiPFvjk7P1+9UMSQkp6doan1x0QZUwJFcAAAAoEvmNnEiZIydMEXSMpLO5J1aFaYfsSK4AAABQJBg5ca4AHy+7tkN2JFcAAAAoEoycOFerMH8F+Xkpt9VUJmWufWsV5l+UYZUoJFcAAAAoEoycOJe7m0kTosMlKVuClXV7QnQ4xSyuA8kVAAAAigQjJ84XFRGkuQObK9DPOoEN9PPS3IHNqdZ4nSjFDgAAgCKRNXIyYvF2mSSrwhaMnBSdqIggdQkP1Nb4ZCWdTVeAT2ZCS79fP5NhGJRjuUZqaqr8/PyUkpIiX19fZ4cDAABQorDPFYoTW3IDRq4AAABQpBg5QUlFcgUAAIAi5+5mUmSdys4OA7ArCloAAAAAgB2QXAEAAACAHZBcAQAAAIAdkFwBAAAAgB2QXAEAAACAHZBcAQAAAIAdkFwBAAAAgB2QXAEAAACAHZBcAQAAAIAdkFwBAAAAgB2QXAEAAACAHZBcAQAAAIAdkFwBAAAAgB2QXAEAAACAHTg1uVq3bp2io6MVHBwsk8mkJUuWWJ0fOnSoTCaT1U9UVFS+1509e7Zq1aolLy8vtW7dWlu3bnXQMwAAAACATE5Nrs6dO6cmTZpo9uzZubaJiopSQkKC5eeTTz7J85qfffaZxo4dqwkTJmj79u1q0qSJunXrpqSkJHuHDwAAAJQIGWZDmw6e0jc7jmnTwVPKMBvODqlYKuPMB+/evbu6d++eZxtPT08FBgYW+JpvvPGGHnjgAd13332SpHnz5unbb7/Ve++9p2eeeea64gUAAABKmtjdCYpZFqeElHTLsSA/L02IDldURJATIyt+XH7N1U8//aSAgADVr19fI0aM0KlTp3Jte+nSJf3666/q3Lmz5Zibm5s6d+6sTZs25Xq/ixcvKjU11eoHAAAAKOlidydoxOLtVomVJCWmpGvE4u2K3Z3gpMiKJ5dOrqKiovTBBx9o9erVevXVV7V27Vp1795dGRkZObb/559/lJGRoWrVqlkdr1atmhITE3N9nClTpsjPz8/yExISYtfnAQAAALiaDLOhmGVxymkCYNaxmGVxTBG0gVOnBebnnnvusfx/48aNdeONN6pOnTr66aef1KlTJ7s9zvjx4zV27FjL7dTUVBIsAAAAlGhb45OzjVhdzZCUkJKurfHJiqxTuegCK8ZsSq7MZrPWrl2r9evX6/Dhwzp//ryqVq2qZs2aqXPnzg5PSGrXrq0qVarowIEDOSZXVapUkbu7u06cOGF1/MSJE3mu2/L09JSnp6fd4wUAAChuMsyGtsYnK+lsugJ8vNQqzF/ubiZnhwUHSDqbe2JVmHYo4LTACxcuaPLkyQoJCVGPHj20YsUKnTlzRu7u7jpw4IAmTJigsLAw9ejRQ5s3b3ZYsEePHtWpU6cUFJTzwjoPDw+1aNFCq1evthwzm81avXq1IiMjHRYXAABASRC7O0HtXv1R/edv1mOf7lD/+ZvV7tUfWXdTQgX4eNm1HQo4cnXDDTcoMjJS8+fPV5cuXVS2bNlsbQ4fPqyPP/5Y99xzj5599lk98MAD+V43LS1NBw4csNyOj4/Xjh075O/vL39/f8XExOiOO+5QYGCgDh48qKeeekp169ZVt27dLPfp1KmT+vXrp1GjRkmSxo4dqyFDhqhly5Zq1aqVZsyYoXPnzlmqBwIAACC7rMIG166uySpsMHdgcyrHlTCtwvwV5OelxJT0HNddmSQF+mWOXqJgCpRcrVy5Ug0bNsyzTWhoqMaPH68nnnhCR44cKdCDb9u2TR07drTczlr3NGTIEM2dO1e///673n//fZ05c0bBwcHq2rWrJk2aZDWF7+DBg/rnn38st++++26dPHlSL7zwghITE9W0aVPFxsZmK3IBAACATPkVNjAps7BBl/BApgiWIO5uJk2IDteIx
dtlkqz+/bP+lSdEh/NvbgOTYRiU/7hGamqq/Pz8lJKSIl9fX2eHAwAA7IC1RLnbdPCU+s/Pf2nHJw/cTGGDEoh9rvJmS25gc7XA2NhYVahQQe3atZMkzZ49W/Pnz1d4eLhmz56tSpUqFS5qAAAAB+HDY94obFC6RUUEqUt4IF8+2IHN+1w9+eSTlk12d+3apXHjxqlHjx6Kj4+3KmcOAADgCtgkNX8UNoC7m0mRdSqrT9PqiqxTmcSqkGweuYqPj1d4eLgk6X//+5969eqll19+Wdu3b1ePHj3sHiAAwPUx3QquirVEBUNhA8A+bE6uPDw8dP78eUnSqlWrNHjwYEmSv7+/ZUQLAFB6MN0KroxNUguGwgaAfdg8LbBdu3YaO3asJk2apK1bt6pnz56SpH379qlGjRp2DxAA4LqYbgVXx1qigouKCNLcgc0V6Gc99S/Qz4sy7EAB2TxyNWvWLD3yyCP68ssvNXfuXFWvXl2StGLFCkVFRdk9QACAa2K6FYoD1hLZhsIGwPWxObmqWbOmli9fnu349OnT7RIQAKB4YLoVigPWEtkuq7ABANsVKLmyZS0V+0IBQOnAdCsUB6wlQmlAUSHXUaDkqmLFijKZCvYPlJGRcV0BAYAr4Q9W7phu5bp43VrLWkt0beGVQAqvoASgqJBrKVBytWbNGsv/Hzp0SM8884yGDh2qyMhISdKmTZv0/vvva8qUKY6JEgCcgD9YeWO6lWvidZsz1hKhJMoqKnTte3BWUSEKkRQ9k2EYOf1NzFWnTp10//33q3///lbHP/74Y73zzjv66aef7BmfU6SmpsrPz08pKSlMcwRKqdz+YGV9DOMPVqasfpJynm5FPxUtXrdA6ZFhNtTu1R9zXfua9QXXhqdv40uE62RLbmBzKfZNmzapZcuW2Y63bNlSW7dutfVyAOBy8quCJ2VWwcsw2/TdVIlE6WbXwesWKF1sKSqEomNztcCQkBDNnz9fU6dOtTq+YMEChYSE2C0wAHAWquDZhulWroHXLVC6UFTINdmcXE2fPl133HGHVqxYodatW0uStm7dqv379+t///uf3QMEgKLGHyzbUbrZ+XjdAqULRYVck83TAnv06KH9+/crOjpaycnJSk5OVnR0tPbt26cePXo4IkYAKFL8wUJxxOsWKF2yigrlNkfApMxiNhQVKlo2j1xJUo0aNfTyyy/bOxYAcAlUwUNxxOsWKF3Yw801FSq5OnPmjLZu3aqkpCSZzWarc4MHD7ZLYADgLPzBQnHE6xYofdjDzfXYXIp92bJlGjBggNLS0uTr62u1ubDJZFJycvGvSEIpdgAS+wWheOJ1C5Q+bBzuWLbkBjYnVzfccIN69Oihl19+WeXKlbuuQF0VyRWALPzBQnHE6xYA7MehyVX58uW1a9cu1a5d+7qCdGUkVwAAAAAkB28i3K1bN23btq3QwQEAAABASWRzQYuePXvqySefVFxcnBo3bqyyZctane/du7fdggMAAACA4sLmaYFubrkPdplMJmVkZFx3UM7GtEAAAAAAkm25gc0jV9eWXgcAAAAAFGLNFQAAAAAgu0IlV2vXrlV0dLTq1q2runXrqnfv3lq/fr29YwMAAACAYsPm5Grx4sXq3LmzypUrp9GjR2v06NHy9vZWp06d9PHHHzsiRgAAcpVhNrTp4Cl9s+OYNh08pQyzTUuJAQCwG5sLWjRs2FAPPvigHn/8cavjb7zxhubPn689e/bYNUBnoKAFABQPsbsTFLMsTgkp6ZZjQX5emhAdrqiIICdGBhQ9No8GHMOhmwh7enrqjz/+UN26da2OHzhwQBEREUpPT8/lnsUHyRUAuKarPzwe+ue8Zqzap2v/iGV9lJw7sDkJFkoNvmgAHMeh1QJDQkK0evXqbMnVqlWrFBISYuvlAAAokJw+PObEUGaCFbMsTl3CA/nmHiVe7O4EjVi8PdsXDYkp6RqxeDtfNABFyObkaty4cRo9erR27NihNm3aSJJ+/
[... base64-encoded PNG data elided: the notebook's file-size vs. inference-time scatter plot comparing the original and most optimized models ...]",
+      "text/plain": [
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plt.figure(figsize=(10, 6))\n", + "plt.scatter(all_df[\"file_size\"], all_df[\"original_time\"], label=\"Original Model\")\n", + "plt.scatter(all_df[\"file_size\"], all_df[\"trt_gds_gpu_transforms_time\"], label=\"Optimized Model\")\n", + "plt.xlabel(\"File Size (MB)\")\n", + "plt.ylabel(\"Average Inference Time (seconds)\")\n", + "plt.title(\"Comparison of original and most optimized model\")\n", + "plt.legend()\n", + "plt.show()" + ] } ], "metadata": { "kernelspec": { "display_name": "kvikio_env", "language": "python", - "name": "python3" + "name": "kvikio_env" }, "language_info": { "codemirror_mode": { @@ -522,7 +626,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.14" + "version": "3.10.16" } }, "nbformat": 4, diff --git a/acceleration/fast_inference_tutorial/run_benchmark.py b/acceleration/fast_inference_tutorial/run_benchmark.py index 0ec96df3d1..37901dc98d 100644 --- a/acceleration/fast_inference_tutorial/run_benchmark.py +++ b/acceleration/fast_inference_tutorial/run_benchmark.py @@ -49,6 +49,7 @@ def get_transforms(device, gpu_loading_flag=False, gpu_transforms_flag=False): return infer_transforms + def get_post_transforms(infer_transforms): post_transforms = Compose( [ @@ -65,6 +66,7 @@ def get_post_transforms(infer_transforms): ) return post_transforms + def get_model(device, weights_path, trt_model_path, trt_flag=False): if not trt_flag: model = SegResNet( @@ -84,11 +86,12 @@ def get_model(device, weights_path, trt_model_path, trt_flag=False): model = torch.jit.load(trt_model_path) return model + def run_inference(data_list, infer_transforms, model, device, benchmark_type): total_time_dict = {} roi_size = (96, 96, 96) - sw_batch_size = 1 - + sw_batch_size = 4 + for idx, sample in enumerate(data_list): start = timer() data = infer_transforms({"image": sample}) @@ -114,9 +117,10 @@ def run_inference(data_list, 
infer_transforms, model, device, benchmark_type): sample_name = sample.split("/")[-1] if idx > 0: total_time_dict[sample_name] = end - start - + print(f"Time taken for {sample_name}: {end - start} seconds") return total_time_dict + def main(): parser = argparse.ArgumentParser(description="Run inference benchmark.") parser.add_argument("--benchmark_type", type=str, default="original", help="Type of benchmark to run") @@ -128,8 +132,8 @@ def main(): torch_tensorrt.runtime.set_multi_device_safe_mode(True) device = torch.device("cuda:0") if torch.cuda.is_available() else torch.device("cpu") train_files = prepare_test_datalist(root_dir) - # since the dataset is too large, the smallest 21 files are used for warm up (1 file) and benchmarking (11 files) - train_files = sorted(train_files, key=lambda x: os.path.getsize(x), reverse=False)[:21] + # since the dataset is too large, the smallest 31 files are used for warm up (1 file) and benchmarking (30 files) + train_files = sorted(train_files, key=lambda x: os.path.getsize(x), reverse=False)[:31] weights_path = prepare_model_weights(root_dir=root_dir, bundle_name="wholeBody_ct_segmentation") trt_model_name = "model_trt.ts" trt_model_path = prepare_tensorrt_model(root_dir, weights_path, trt_model_name) @@ -146,5 +150,6 @@ def main(): df = pd.DataFrame(list(total_time_dict.items()), columns=["file_name", "time"]) df.to_csv(os.path.join(root_dir, f"time_{args.benchmark_type}.csv"), index=False) + if __name__ == "__main__": main() diff --git a/acceleration/fast_inference_tutorial/utils.py b/acceleration/fast_inference_tutorial/utils.py index ac14f55845..60486b7bf0 100644 --- a/acceleration/fast_inference_tutorial/utils.py +++ b/acceleration/fast_inference_tutorial/utils.py @@ -78,7 +78,7 @@ def prepare_tensorrt_model(root_dir, weights_path, trt_model_name="model_trt.ts" model=model, precision="fp16", input_shape=[1, 1, 96, 96, 96], - dynamic_batchsize=[1, 1, 1], + dynamic_batchsize=[1, 4, 4], use_trace=True, verify=False, ) diff 
--git a/runner.sh b/runner.sh index 07c9c07d7b..964c37b6d5 100755 --- a/runner.sh +++ b/runner.sh @@ -70,6 +70,7 @@ doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" TCIA_PROSTATEx_Pros doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" lazy_resampling_functional.ipynb) doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" lazy_resampling_compose.ipynb) doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" TensorRT_inference_acceleration.ipynb) +doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" fast_inference_tutorial.ipynb) doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" lazy_resampling_benchmark.ipynb) doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" modular_patch_inferer.ipynb) doesnt_contain_max_epochs=("${doesnt_contain_max_epochs[@]}" GDS_dataset.ipynb) @@ -117,6 +118,7 @@ skip_run_papermill=("${skip_run_papermill[@]}" .*swinunetr_finetune*) skip_run_papermill=("${skip_run_papermill[@]}" .*active_learning*) skip_run_papermill=("${skip_run_papermill[@]}" .*transform_visualization*) # https://github.com/Project-MONAI/tutorials/issues/1155 skip_run_papermill=("${skip_run_papermill[@]}" .*TensorRT_inference_acceleration*) +skip_run_papermill=("${skip_run_papermill[@]}" .*fast_inference_tutorial*) skip_run_papermill=("${skip_run_papermill[@]}" .*mednist_classifier_ray*) # https://github.com/Project-MONAI/tutorials/issues/1307 skip_run_papermill=("${skip_run_papermill[@]}" .*TorchIO_MONAI_PyTorch_Lightning*) # https://github.com/Project-MONAI/tutorials/issues/1324 skip_run_papermill=("${skip_run_papermill[@]}" .*GDS_dataset*) # https://github.com/Project-MONAI/tutorials/issues/1324 From 82daa3264747ab23db52fc361f3ca2d13d390773 Mon Sep 17 00:00:00 2001 From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com> Date: Sat, 8 Mar 2025 03:57:49 +0000 Subject: [PATCH 08/11] [pre-commit.ci] auto fixes from pre-commit.com hooks for more information, see https://pre-commit.ci --- 
.../fast_inference_tutorial.ipynb | 24 ++++++++++++------- 1 file changed, 16 insertions(+), 8 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 9ddc17d7dc..b75ba47cd4 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -370,7 +370,7 @@ " total_time_dict = {}\n", " roi_size = (96, 96, 96)\n", " sw_batch_size = 4\n", - " \n", + "\n", " for idx, sample in enumerate(data_list[:10]):\n", " start = timer()\n", " data = infer_transforms({\"image\": sample})\n", @@ -474,7 +474,9 @@ " all_df = pd.merge(all_df, df, on=\"file_name\", how=\"left\")\n", "\n", "# for each file, add it's size\n", - "all_df[\"file_size\"] = all_df[\"file_name\"].apply(lambda x: os.path.getsize(os.path.join(root_dir, \"Task03_Liver\", \"imagesTs_nii\", x)))\n", + "all_df[\"file_size\"] = all_df[\"file_name\"].apply(\n", + " lambda x: os.path.getsize(os.path.join(root_dir, \"Task03_Liver\", \"imagesTs_nii\", x))\n", + ")\n", "# sort by file size\n", "all_df = all_df.sort_values(by=\"file_size\", ascending=True)\n", "# convert file size to MB\n", @@ -538,8 +540,14 @@ ], "source": [ "print(\"TensorRT Improvement: \", (total_time[\"original_time\"] - total_time[\"trt_time\"]) / total_time[\"original_time\"])\n", - "print(\"TensorRT + GPU Transforms Improvement: \", (total_time[\"original_time\"] - total_time[\"trt_gpu_transforms_time\"]) / total_time[\"original_time\"])\n", - "print(\"TensorRT + GDS + GPU Transforms Improvement: \", (total_time[\"original_time\"] - total_time[\"trt_gds_gpu_transforms_time\"]) / total_time[\"original_time\"])" + "print(\n", + " \"TensorRT + GPU Transforms Improvement: \",\n", + " (total_time[\"original_time\"] - total_time[\"trt_gpu_transforms_time\"]) / total_time[\"original_time\"],\n", + ")\n", + "print(\n", + " \"TensorRT + GDS + GPU Transforms 
Improvement: \",\n", + " (total_time[\"original_time\"] - total_time[\"trt_gds_gpu_transforms_time\"]) / total_time[\"original_time\"],\n", + ")" ] }, { @@ -562,10 +570,10 @@ "total_time.index = [\"pytorch_model\", \"TensorRT\", \"TensorRT_GPU_Transform\", \"TensorRT_GPU_Transform_GDS\"]\n", "\n", "plt.figure(figsize=(10, 6))\n", - "total_time.plot(kind='bar', color=['skyblue', 'orange', 'green', 'red'])\n", - "plt.title('Total Inference Time for Each Benchmark Type')\n", - "plt.xlabel('Benchmark Type')\n", - "plt.ylabel('Total Time (seconds)')\n", + "total_time.plot(kind=\"bar\", color=[\"skyblue\", \"orange\", \"green\", \"red\"])\n", + "plt.title(\"Total Inference Time for Each Benchmark Type\")\n", + "plt.xlabel(\"Benchmark Type\")\n", + "plt.ylabel(\"Total Time (seconds)\")\n", "plt.xticks(rotation=45)\n", "plt.tight_layout()\n", "plt.show()" From 6d650ef59f07c9b9439166b71b13671db00d3a16 Mon Sep 17 00:00:00 2001 From: Yiheng Wang Date: Sat, 8 Mar 2025 04:23:45 +0000 Subject: [PATCH 09/11] fix pep8 Signed-off-by: Yiheng Wang --- .../fast_inference_tutorial.ipynb | 15 ++++----------- 1 file changed, 4 insertions(+), 11 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index b75ba47cd4..204c693ea2 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -42,13 +42,8 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Install environment" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ + "## Setup environment\n", + "\n", "Loading data directly from disk to GPU memory requires the `kvikio` library. In addition, this tutorial requires many other dependencies such as `monai`, `torch`, `torch_tensorrt`, `numpy`, `ignite`, `pandas`, `matplotlib`, etc. 
We recommend using the [MONAI Docker](https://docs.monai.io/en/latest/installation.html#from-dockerhub) image to run this tutorial, which includes pre-configured dependencies and allows you to skip manual installation.\n", "\n", "If not using MONAI Docker, install `kvikio` using one of these methods:\n", @@ -113,10 +108,10 @@ "from monai.inferers import sliding_window_inference\n", "from monai.networks.nets import SegResNet\n", "import matplotlib.pyplot as plt\n", - "import torch\n", "import gc\n", "import pandas as pd\n", "from timeit import default_timer as timer\n", + "from utils import prepare_test_datalist, prepare_model_weights, prepare_tensorrt_model\n", "\n", "print_config()" ] @@ -284,8 +279,6 @@ } ], "source": [ - "from utils import prepare_test_datalist, prepare_model_weights, prepare_tensorrt_model\n", - "\n", "root_dir = \".\"\n", "torch.backends.cudnn.benchmark = True\n", "torch_tensorrt.runtime.set_multi_device_safe_mode(True)\n", @@ -465,7 +458,7 @@ "outputs": [], "source": [ "# collect benchmark results\n", - "all_df = pd.read_csv(os.path.join(root_dir, f\"time_original.csv\"))\n", + "all_df = pd.read_csv(os.path.join(root_dir, \"time_original.csv\"))\n", "all_df.columns = [\"file_name\", \"original_time\"]\n", "\n", "for benchmark_type in [\"trt\", \"trt_gpu_transforms\", \"trt_gds_gpu_transforms\"]:\n", From bcdfe9584a57f181679c5c42f88140b3c1b67caf Mon Sep 17 00:00:00 2001 From: Yiheng Wang <68361391+yiheng-wang-nv@users.noreply.github.com> Date: Thu, 20 Mar 2025 11:41:34 +0800 Subject: [PATCH 10/11] Update acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb Co-authored-by: YunLiu <55491388+KumoLiu@users.noreply.github.com> Signed-off-by: Yiheng Wang <68361391+yiheng-wang-nv@users.noreply.github.com> --- .../fast_inference_tutorial/fast_inference_tutorial.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb 
b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 204c693ea2..36a1aa70a4 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -129,7 +129,7 @@ "source": [ "### 1. TensorRT Inference\n", "\n", - "`monai.networks.utils.convert_to_trt` is a function that converts a PyTorch model to a TensorRT engine-based TorchScript model. Except the loading method (need to use `torch.jit.load` to load the model), the usage of the converted TorchScriptmodel is the same as the original model.\n", + "`monai.networks.utils.convert_to_trt` is a function that converts a PyTorch model to a TensorRT engine-based TorchScript model. Apart from the loading step (the converted model must be loaded with `torch.jit.load`), the converted TorchScript model is used in the same way as the original model.\n", "\n", "`monai.data.torchscript_utils.save_net_with_metadata` is a function that saves the converted TorchScript model and its metadata.\n", "\n", From 1dccf638537ff940593efdff9bbcab6bff951773 Mon Sep 17 00:00:00 2001 From: Yiheng Wang Date: Mon, 24 Mar 2025 08:26:02 +0000 Subject: [PATCH 11/11] update doc Signed-off-by: Yiheng Wang --- .../fast_inference_tutorial.ipynb | 45 ++++++++++++------- 1 file changed, 28 insertions(+), 17 deletions(-) diff --git a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb index 36a1aa70a4..ea0f398c7a 100644 --- a/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb +++ b/acceleration/fast_inference_tutorial/fast_inference_tutorial.ipynb @@ -203,7 +203,7 @@ "loader = LoadImaged(keys=\"image\", reader=\"NibabelReader\", to_gpu=True)\n", "```\n", "\n", - "Please note that only NIfTI (.nii, for compressed \".nii.gz\" files, this feature also supports but the acceleration is not significant) and DICOM (.dcm) files are supported for direct GPU data
loading.\n" + "Please note that only NIfTI (`.nii`; compressed `.nii.gz` files are also supported, but the acceleration is not guaranteed) and DICOM (`.dcm`) files are supported for direct GPU data loading.\n" ] }, { @@ -265,27 +265,15 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Test data already exists at ./Task03_Liver/imagesTs_nii\n", - "Weights already exists at ./model.pt\n", - "TensorRT model already exists at ./model_trt.ts\n" - ] - } - ], + "outputs": [], "source": [ "root_dir = \".\"\n", "torch.backends.cudnn.benchmark = True\n", "torch_tensorrt.runtime.set_multi_device_safe_mode(True)\n", "device = torch.device(\"cuda:0\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", "train_files = prepare_test_datalist(root_dir)\n", - "# since the dataset is too large, the smallest 31 files are used for warm up (1 file) and benchmarking (30 files)\n", - "train_files = sorted(train_files, key=lambda x: os.path.getsize(x), reverse=False)[:31]\n", "weights_path = prepare_model_weights(root_dir=root_dir, bundle_name=\"wholeBody_ct_segmentation\")\n", "trt_model_name = \"model_trt.ts\"\n", "trt_model_path = prepare_tensorrt_model(root_dir, weights_path, trt_model_name)" @@ -609,13 +597,36 @@ "plt.legend()\n", "plt.show()" ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Limitations\n", + "\n", + "Although the optimizations have shown significant improvements in inference time, there are still some limitations to consider:\n", + "\n", + "1. **TensorRT**: \n", + " - **Model Compatibility**: Not all models are compatible with TensorRT. Models with unsupported layers or operations may not benefit from TensorRT acceleration.\n", + " - **Batch Size**: TensorRT is optimized for larger batch sizes. 
For very small batch sizes, the overhead of conversion and execution might outweigh the performance gains.\n", + " - **Precision**: While using lower precision (e.g., FP16) can speed up inference, it may lead to a loss in model accuracy, which is critical in medical imaging applications.\n", + "\n", + "2. **GPU-Based Preprocessing**:\n", + " - **Memory Usage**: The GPU-based preprocessing requires additional GPU memory. This can be a limitation if the available GPU memory is limited.\n", + "\n", + "3. **GPU Direct Storage (GDS)**:\n", + " - **File Format Support**: Currently, only specific file formats like NIfTI (for compressed `.nii.gz` NIFTI files, this feature also supports but the acceleration is not guaranteed) and DICOM are supported for direct GPU data loading. Other formats may not benefit from this feature.\n", + " - **Small File Acceleration**: For small files, the overhead of conversion and execution might outweigh the performance gains.\n", + "\n", + "By understanding these limitations, users can better assess when and how to apply these acceleration features effectively in their workflows." + ] } ], "metadata": { "kernelspec": { - "display_name": "kvikio_env", + "display_name": "monai_tutorial", "language": "python", - "name": "kvikio_env" + "name": "python3" }, "language_info": { "codemirror_mode": {