feat: add C++ JSI interface and Executorch dependency #184

Closed
wants to merge 39 commits

Commits: 39 total (changes shown from 33 commits)
9ca8288
feat: Add image segmentation example (#138)
JakubGonera Mar 24, 2025
7ec9f07
docs: Add documentation for image segmentation (#149)
JakubGonera Mar 27, 2025
346f74d
feat: add support for hugging face tokenizers, add executorch and tok…
NorbertKlockiewicz Mar 27, 2025
2145796
feat: add option to pass skip special tokens flag to decode (#162)
NorbertKlockiewicz Mar 31, 2025
a8e7f96
feat: Add multilingual Whisper (#166)
chmjkb Apr 1, 2025
6c8a629
fix: Change BOS/EOS token ids for non-multilingual Whisper in model c…
chmjkb Apr 1, 2025
a245872
fix: pass skipSpecialTokens param to Tokenizer in _TokenizerModule (#…
chmjkb Apr 3, 2025
6e210bd
feat: bump executorch 0.6.0 (#171)
NorbertKlockiewicz Apr 14, 2025
6c58d9f
Add spell checking to codebase (#185)
pweglik Apr 14, 2025
9b19a27
chore: bump model urls to use v0.4.0 tag (#169)
chmjkb Apr 15, 2025
4fb680d
deps: bump React Native versions in demo apps (#189)
chmjkb Apr 16, 2025
b8c15af
chore: @jakmro/update aar (#196)
jakmro Apr 16, 2025
093a32b
docs: add docs for mutlilingual Whisper (#176)
chmjkb Apr 16, 2025
8391067
feat: multilingual ocr (#192)
NorbertKlockiewicz Apr 22, 2025
4a5dff0
fix: error handling (#197)
pweglik Apr 22, 2025
deb768a
feat: text embeddings (#163)
jakmro Apr 23, 2025
eda47a5
feat: Add markdown spell-checker (#204)
pweglik Apr 24, 2025
52ecf0f
fix: (iOS) prevent exceeding kMaxContextLen from crashing the app (#207)
chmjkb Apr 24, 2025
9100418
chore: Add more models to modelUrls.ts (#215)
NorbertKlockiewicz Apr 25, 2025
61fd540
fix: new executorch android runtime with fix for linear op (#216)
NorbertKlockiewicz Apr 25, 2025
21d9b04
feat: Support tokenizer config and add tool support.
pweglik Apr 28, 2025
e7d9563
demo: Example app with calendar assistant using tool calling (+ docs …
pweglik Apr 28, 2025
b86d267
feat: Add ET library for direct use in C++
JakubGonera Apr 11, 2025
3a6a839
fix: Revert change development team change
JakubGonera Apr 11, 2025
b6b1004
fix: remove ifs for old architecture in cmake
JakubGonera Apr 14, 2025
e679a71
fix: remove ETInstaller module wrapper
JakubGonera Apr 14, 2025
1bab81b
fix: omit new arch checks in cmake
JakubGonera Apr 14, 2025
512b3b6
fix: remove new arch checks from gradle
JakubGonera Apr 14, 2025
c1df544
chore: remove additional executorch archive
JakubGonera Apr 17, 2025
a9745bd
chore: resolve et 0.6.0 conflict
JakubGonera Apr 17, 2025
fde47dd
chore: update c++/ios ET libs
JakubGonera Apr 17, 2025
08b71a8
fix: remove unnecessary export
JakubGonera Apr 30, 2025
a98655b
Increase log messages and add ellipsis
JakubGonera May 14, 2025
1d3224c
feat: port style transfer implementation to C++ (#229)
JakubGonera May 20, 2025
1a567c3
fix: fix exception handling in C++ native code (#244)
JakubGonera May 20, 2025
a7e3e4e
fix: refactor C++ JSI promises (#286)
JakubGonera May 20, 2025
e5c97ad
fix: generic loading of host functions in C++ (#315)
JakubGonera May 21, 2025
0442aa3
fix: make the host function installation generic (#327)
JakubGonera May 23, 2025
24f4d2a
feat: port image segmentation native code to C++ (#313)
JakubGonera May 27, 2025
48 changes: 48 additions & 0 deletions .cspell-wordlist.txt
@@ -0,0 +1,48 @@
swmansion
executorch
execu
Execu
torch
huggingface
bbox
bboxes
deeplab
unsqueeze
qlora
spinquant
efficientnet
ssdlite
udnie
crnn
mobilenet
microcontrollers
notimestamps
seqs
smollm
qwen
XNNPACK
EFFICIENTNET
SSDLITE
MOBILENET
UDNIE
CRNN
SPINQUANT
QLORA
GGUF
deeplabv
DEELABV
ARGMAX
Abaza
Adyghe
Chech
Dargwa
Ingush
Karbadian
Lezghian
Occitan
Tabassaran
Sinhala
Infima
sublabel
Aeonik
Lexend
12 changes: 12 additions & 0 deletions .cspell.json
@@ -0,0 +1,12 @@
{
"version": "0.2",
"language": "en",
"ignorePaths": ["**/node_modules", "**/Pods"],
"dictionaryDefinitions": [
{
"name": "project-words",
"path": ".cspell-wordlist.txt"
}
],
"dictionaries": ["project-words"]
}
38 changes: 38 additions & 0 deletions .eslintrc
@@ -0,0 +1,38 @@
{
"parserOptions": {
"requireConfigFile": false,
"babelOptions": {
"presets": [
"@babel/preset-react"
]
}
},
"root": true,
"extends": [
"@react-native",
"prettier",
"plugin:@cspell/recommended"
],
"rules": {
"react/react-in-jsx-scope": "off",
"prettier/prettier": [
"error",
{
"quoteProps": "consistent",
"singleQuote": true,
"tabWidth": 2,
"trailingComma": "es5",
"useTabs": false
}
],
"@cspell/spellchecker": ["warn", { "customWordListFile": ".cspell-wordlist.txt" }]

},
"plugins": [
"eslint-plugin-prettier"
],
"ignorePatterns": [
"node_modules/",
"lib/"
]
}
.github/workflows/build-android-llama-example.yml → .github/workflows/build-android-llm-example.yml
@@ -1,27 +1,27 @@
name: Llama Example app Android build check
name: LLM Example app Android build check
on:
pull_request:
paths:
- .github/workflows/build-android-llama-example.yml
- .github/workflows/build-android-llm-example.yml
- android/**
- third-party/android/**
- examples/llama/package.json
- examples/llama/android/**
- examples/llm/package.json
- examples/llm/android/**
push:
branches:
- main
paths:
- .github/workflows/build-android-llama-example.yml
- .github/workflows/build-android-llm-example.yml
- android/**
- third-party/android/**
- examples/llama/package.json
- examples/llama/android/**
- examples/llm/package.json
- examples/llm/android/**
jobs:
build:
if: github.repository == 'software-mansion/react-native-executorch'
runs-on: ubuntu-latest
env:
WORKING_DIRECTORY: examples/llama
WORKING_DIRECTORY: examples/llm
concurrency:
group: android-${{ github.ref }}
cancel-in-progress: true
.github/workflows/build-ios-llama-example.yml → .github/workflows/build-ios-llm-example.yml
@@ -1,19 +1,19 @@
name: Llama Example app iOS build check
name: LLM Example app iOS build check
on:
push:
branches:
- main
paths:
- '.github/workflows/build-ios-llama-example.yml'
- '.github/workflows/build-ios-llm-example.yml'
- '*.podspec'
- 'examples/llama/ios/**'
- 'examples/llama/package.json'
- 'examples/llm/ios/**'
- 'examples/llm/package.json'
pull_request:
paths:
- '.github/workflows/build-ios-llama-example.yml'
- '.github/workflows/build-ios-llm-example.yml'
- '*.podspec'
- 'examples/llama/ios/**'
- 'examples/llama/package.json'
- 'examples/llm/ios/**'
- 'examples/llm/package.json'
jobs:
build:
if: github.repository == 'software-mansion/react-native-executorch'
@@ -25,17 +25,17 @@ jobs:
- name: Check out Git repository
uses: actions/checkout@v4
- name: Install node dependencies
working-directory: examples/llama
working-directory: examples/llm
run: yarn
- name: Install pods
working-directory: examples/llama/ios
working-directory: examples/llm/ios
run: pod install
- name: Build app
working-directory: examples/llama/ios
working-directory: examples/llm/ios
run: |
set -o pipefail && xcodebuild \
-workspace llama.xcworkspace \
-scheme llama \
-workspace llm.xcworkspace \
-scheme llm \
-sdk iphonesimulator \
-configuration Debug \
-destination 'platform=iOS Simulator,name=iPhone 16 Pro' \
4 changes: 4 additions & 0 deletions .gitignore
@@ -80,3 +80,7 @@ lib/
# React Native Codegen
ios/generated
android/generated

# custom
*.tgz
Makefile
6 changes: 6 additions & 0 deletions .gitmodules
@@ -0,0 +1,6 @@
[submodule "executorch"]
path = third-party/executorch
url = https://github.com/software-mansion-labs/executorch
[submodule "tokenizers-cpp"]
path = third-party/tokenizers-cpp
url = https://github.com/software-mansion-labs/tokenizers-cpp
7 changes: 7 additions & 0 deletions .prettierrc
@@ -0,0 +1,7 @@
{
"quoteProps": "consistent",
"singleQuote": true,
"tabWidth": 2,
"trailingComma": "es5",
"useTabs": false
}
19 changes: 10 additions & 9 deletions README.md
@@ -14,7 +14,7 @@ React Native Executorch supports only the [New React Native architecture](https:

If your app still runs on the old architecture, please consider upgrading to the New Architecture.

## Readymade models 🤖
## Ready-made models 🤖

To run any AI model in ExecuTorch, you need to export it to a `.pte` format. If you're interested in experimenting with your own models, we highly encourage you to check out the [Python API](https://pypi.org/project/executorch/). If you prefer focusing on developing your React Native app, we will cover several common use cases. For more details, please refer to the documentation.
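
As a minimal sketch of that export step (assuming the `executorch` Python package; `TinyModel` and the file name are placeholders for illustration), the flow looks roughly like this:

```python
import torch
from executorch.exir import to_edge


# Toy stand-in for whatever model you want to run on-device.
class TinyModel(torch.nn.Module):
    def forward(self, x):
        return torch.nn.functional.relu(x)


model = TinyModel().eval()
example_inputs = (torch.randn(1, 4),)

# Capture the model, lower it to the Edge dialect, then emit an ExecuTorch program.
exported = torch.export.export(model, example_inputs)
executorch_program = to_edge(exported).to_executorch()

# Write the .pte file that react-native-executorch can load.
with open("tiny_model.pte", "wb") as f:
    f.write(executorch_program.buffer)
```

The resulting `.pte` file can then be served from a URL or bundled as an asset, just like the hosted model constants used in the quick-start below.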

@@ -43,16 +43,17 @@ Add this to your component file:

```tsx
import {
LLAMA3_2_3B_QLORA,
LLAMA3_2_3B_TOKENIZER,
useLLM,
LLAMA3_2_1B,
LLAMA3_2_TOKENIZER_CONFIG,
} from 'react-native-executorch';

function MyComponent() {
// Initialize the model 🚀
const llama = useLLM({
modelSource: LLAMA3_2_3B_QLORA,
tokenizerSource: LLAMA3_2_3B_TOKENIZER,
modelSource: LLAMA3_2_1B,
tokenizerSource: LLAMA3_2_TOKENIZER,
tokenizerConfigSource: LLAMA3_2_TOKENIZER_CONFIG,
});
// ... rest of your component
}
@@ -67,8 +68,8 @@ const handleGenerate = async () => {
const prompt = 'The meaning of life is';

// Generate text based on your desired prompt
const response = await llama.generate(prompt);
console.log('Llama says:', response);
await llama.runInference(prompt);
console.log('Llama says:', llama.response);
};
```

@@ -84,9 +85,9 @@ We currently host two example apps demonstrating use cases of our library:

- examples/speech-to-text - Whisper and Moonshine models ready for transcription tasks
- examples/computer-vision - computer vision related tasks
- examples/llama - chat applications showcasing use of LLMs
- examples/llm - chat applications showcasing use of LLMs

If you would like to run it, navigate to it's project directory, for example `examples/llama` from the repository root and install dependencies with:
If you would like to run it, navigate to its project directory, for example `examples/llm` from the repository root, and install dependencies with:

```bash
yarn
```
17 changes: 17 additions & 0 deletions android/CMakeLists.txt
@@ -0,0 +1,17 @@
cmake_minimum_required(VERSION 3.13)
project(RnExecutorch)

set (CMAKE_VERBOSE_MAKEFILE ON)
set (CMAKE_CXX_STANDARD 20)

include("${REACT_NATIVE_DIR}/ReactAndroid/cmake-utils/folly-flags.cmake")
add_compile_options(${folly_FLAGS})

string(APPEND CMAKE_CXX_FLAGS " -DRCT_NEW_ARCH_ENABLED")

set(ANDROID_CPP_DIR "${CMAKE_SOURCE_DIR}/src/main/cpp")
set(COMMON_CPP_DIR "${CMAKE_SOURCE_DIR}/../common")
set(ET_LIB_DIR "${CMAKE_SOURCE_DIR}/../third-party/android/libs")
set(ET_INCLUDE_DIR "${CMAKE_SOURCE_DIR}/../third-party/include")

add_subdirectory("${ANDROID_CPP_DIR}")