
Releases: xlite-dev/lite.ai.toolkit

v0.3.4

02 Jul 01:42
a5bdb02

What's Changed

Full Changelog: v0.3.3...v0.3.4

v0.3.3

28 Apr 06:08
33a4c3a

What's Changed

Full Changelog: v0.3.2...v0.3.3

v0.3.2

10 Apr 06:03
cd89b65

What's Changed

Full Changelog: v0.3.1...v0.3.2

v0.3.1.post1

05 Feb 07:42
4af23c2

v0.3.1

02 Dec 05:31
aec2f7c

What's Changed

New Contributors

Full Changelog: DefTruth/lite.ai.toolkit@v0.2.0...v0.3.1

v0.3.0 Linux GPU: TensorRT

14 Oct 04:56
d4af41c

New Features: NVIDIA GPU Inference support via TensorRT

🎉🎉TensorRT: Boost inference performance with NVIDIA GPU via TensorRT.

Run `bash ./build.sh tensorrt` to build lite.ai.toolkit with TensorRT support, then test YOLOv5 with the code below. NOTE: lite.ai.toolkit needs TensorRT 10.x (or later) and CUDA 12.x (or later). Please check build.sh, tensorrt-linux-x86_64-install.zh.md, test_lite_yolov5.cpp and NVIDIA/TensorRT for more details.

// trtexec --onnx=yolov5s.onnx --saveEngine=yolov5s.engine
auto *yolov5 = new lite::trt::cv::detection::YOLOV5(engine_path);
| Class | Class | Class | Class | Class | System | Engine |
| --- | --- | --- | --- | --- | --- | --- |
| YOLOv5 | YOLOv6 | YOLOv8 | YOLOv8Face | YOLOv5Face | Linux | TensorRT |
| YOLOX | YOLOv5BlazeFace | StableDiffusion | / | / | Linux | TensorRT |
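As a fuller sketch of the flow above, the following shows a minimal end-to-end detection pass: build the engine with `trtexec`, construct the TensorRT YOLOv5 detector, and run it on an image. The `detect`/`draw_boxes_inplace` calls and the file names `yolov5s.engine`, `test.jpg`, and `result.jpg` are assumptions based on the toolkit's usual usage pattern (see test_lite_yolov5.cpp for the authoritative version), not verified against this exact release:

```cpp
// Build the engine first (one-time step):
//   trtexec --onnx=yolov5s.onnx --saveEngine=yolov5s.engine
#include "lite/lite.h"  // lite.ai.toolkit umbrella header (assumed path)

int main() {
  std::string engine_path = "yolov5s.engine";  // hypothetical engine file
  std::string img_path = "test.jpg";           // hypothetical input image

  // Construct the TensorRT-backed YOLOv5 detector from the serialized engine.
  auto *yolov5 = new lite::trt::cv::detection::YOLOV5(engine_path);

  // Run detection; boxes are returned via the output vector.
  std::vector<lite::types::Boxf> detected_boxes;
  cv::Mat img_bgr = cv::imread(img_path);
  yolov5->detect(img_bgr, detected_boxes);

  // Draw the detections and save the annotated image.
  lite::utils::draw_boxes_inplace(img_bgr, detected_boxes);
  cv::imwrite("result.jpg", img_bgr);

  delete yolov5;
  return 0;
}
```

The same pattern should apply to the other TensorRT classes in the table above (YOLOv6, YOLOv8, etc.), swapping only the class name and engine file.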

What's Changed

New Contributors

Full Changelog: DefTruth/lite.ai.toolkit@v0.2.0...v0.3.0-rc1

v0.2.0 Linux CPU: ONNXRuntime + MNN

21 Mar 01:38
a82ee1f

v0.2.0-rc3

19 Mar 14:49

v0.2.0-rc2

19 Mar 05:57
f7b4edf

v0.2.0-rc1: Merge pull request #399 from DefTruth/linux-dev

18 Mar 15:04
5f14d0a