This is a React Native example app for testing the @novastera-oss/llamarn library, which provides on-device LLM inference capabilities.
Note: Make sure you have completed the Set Up Your Environment guide before proceeding.
Before running the example app, you need to download AI models to test with. The app expects GGUF format models.
Download GGUF models from HuggingFace or other sources. For testing, we recommend:
- For iOS: Larger models like Mistral-7B-Instruct-v0.3.Q4_K_M.gguf (~4.1GB)
- For Android: Smaller models like Llama-3.2-1B-Instruct-Q4_K_M.gguf (~770MB) due to build size limitations
Create an assets directory in the example folder and place your downloaded models there:
```
example/
├── assets/
│   ├── Llama-3.2-1B-Instruct-Q4_K_M.gguf
│   └── Mistral-7B-Instruct-v0.3.Q4_K_M.gguf
├── android/
├── ios/
└── src/
```
The example app includes scripts to automatically copy models to the correct platform-specific locations:
```sh
# Copy assets to both platforms
npm run copy-assets

# Or copy manually for specific platforms
# (This happens automatically before builds)
```

Important Notes:
- iOS: All models are copied to the iOS bundle and must be added to Xcode project
- Android: Only models ≤1GB are copied to avoid build failures with large files
- The app automatically selects appropriate models per platform
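The size gating described in the notes above can be sketched as a small helper. This is an illustrative sketch, not the actual copy-assets script; the function name and data shapes are assumptions:

```typescript
// Hypothetical sketch of per-platform model selection: iOS takes every
// model, Android skips files over 1 GB to avoid build failures.
type ModelFile = { name: string; sizeBytes: number };

const ANDROID_SIZE_LIMIT = 1024 * 1024 * 1024; // 1 GB

function selectModels(files: ModelFile[], platform: 'ios' | 'android'): string[] {
  return files
    .filter((f) => platform === 'ios' || f.sizeBytes <= ANDROID_SIZE_LIMIT)
    .map((f) => f.name);
}

const assets: ModelFile[] = [
  { name: 'Llama-3.2-1B-Instruct-Q4_K_M.gguf', sizeBytes: 770 * 1024 * 1024 },
  { name: 'Mistral-7B-Instruct-v0.3.Q4_K_M.gguf', sizeBytes: 4.1 * 1024 * 1024 * 1024 },
];

console.log(selectModels(assets, 'ios'));     // both models
console.log(selectModels(assets, 'android')); // only the 1B model
```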
First, you will need to run Metro, the JavaScript build tool for React Native.
To start the Metro dev server, run the following command from the root of your React Native project:
```sh
# Using npm
npm start
```

With Metro running, open a new terminal window/pane from the root of your React Native project, and use one of the following commands to build and run your Android or iOS app:
The build process automatically copies appropriate models to Android assets before building:
```sh
# Using npm
npm run android
```

For iOS, remember to install CocoaPods dependencies (this only needs to be run on first clone or after updating native deps).
The first time you create a new project, run the Ruby bundler to install CocoaPods itself:
```sh
bundle install
```

Then, and every time you update your native dependencies, run:

```sh
bundle exec pod install
```

For more information, please visit the CocoaPods Getting Started guide.
The build process automatically copies models to iOS bundle before building:
```sh
# Using npm
npm run ios
```

Note for iOS: After copying models, you may need to add them to your Xcode project's "Copy Bundle Resources" build phase manually if they don't appear automatically.
If everything is set up correctly, you should see your new app running in the Android Emulator, iOS Simulator, or your connected device.
This is one way to run your app — you can also build it directly from Android Studio or Xcode.
The example app provides a comprehensive test interface for the @novastera-oss/llamarn library:
- File Existence Check: Verify that model files are accessible on the device
- Model Info Loading: Get model metadata without full initialization
- Asset Management: Test platform-specific asset handling (iOS bundle vs Android assets)
- Check File Exists: Tests if the model file can be found in the platform-specific location
- Get Model Info: Loads model metadata (parameters, vocabulary size, context size, etc.)
- Platform Detection: Automatically selects appropriate models based on iOS/Android platform
The app demonstrates the cross-platform asset management system that handles:
- iOS: Direct bundle access with RNFS
- Android: Asset copying to cache directory for native access
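A minimal sketch of that per-platform path resolution, assuming the base paths come from react-native-fs (`RNFS.MainBundlePath` on iOS, `RNFS.CachesDirectoryPath` on Android); the helper itself is hypothetical, not part of the library's API:

```typescript
// Illustrative only: where a bundled model ends up on each platform.
// iOS reads the model straight out of the app bundle; Android first
// copies the asset into the cache directory so native code can open it.
function resolveModelPath(
  platform: 'ios' | 'android',
  bundlePath: string, // e.g. RNFS.MainBundlePath on iOS
  cachesPath: string, // e.g. RNFS.CachesDirectoryPath on Android
  modelFile: string,
): string {
  return platform === 'ios'
    ? `${bundlePath}/${modelFile}`
    : `${cachesPath}/${modelFile}`;
}

console.log(resolveModelPath('ios', '/App.app', '/cache', 'model.gguf'));
// → /App.app/model.gguf
```

In the real app, `platform` would come from React Native's `Platform.OS`.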
Open src/ConsolidatedTestScreen.tsx to modify the test interface. The app will automatically update thanks to Fast Refresh.
When you want to forcefully reload:
- Android: Press the R key twice or select "Reload" from the Dev Menu, accessed via Ctrl + M (Windows/Linux) or Cmd ⌘ + M (macOS).
- iOS: Press R in iOS Simulator.
You've successfully run and tested the LLM functionality! 🥳
- If you want to add this new React Native code to an existing application, check out the Integration guide.
- If you're curious to learn more about React Native, check out the docs.
- Learn more about on-device AI solutions at Novastera.
Part of Novastera's suite of privacy-focused solutions, this package enables on-device LLM inference with no data leaving the user's device. We're committed to helping developers build AI-powered applications that respect user privacy.
If you're having issues getting the above steps to work, see the Troubleshooting page.
- iOS: Ensure models are added to Xcode project's "Copy Bundle Resources"
- Android: Check that models are ≤1GB or manually copy to cache directory
- Both: Verify model files are in the correct GGUF format
- Android: Large model files (>1GB) can cause build failures. Use smaller models or exclude them from assets
- iOS: Ensure sufficient device storage and memory for large models
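One quick way to verify a file really is GGUF (per the troubleshooting note above) is to check its magic number: the GGUF spec defines the first four bytes as ASCII "GGUF". This checker is a hypothetical sketch for debugging, not part of the example app:

```typescript
// Sanity check a model file by reading only its 4-byte GGUF magic,
// which avoids loading a multi-gigabyte file into memory.
import { openSync, readSync, closeSync, writeFileSync } from 'fs';
import { tmpdir } from 'os';
import { join } from 'path';

function looksLikeGguf(path: string): boolean {
  const fd = openSync(path, 'r');
  try {
    const header = Buffer.alloc(4);
    const bytesRead = readSync(fd, header, 0, 4, 0);
    return bytesRead === 4 && header.toString('ascii') === 'GGUF';
  } finally {
    closeSync(fd);
  }
}

// Demo with a stand-in file (a real check would point at your model path):
const demo = join(tmpdir(), 'demo.gguf');
writeFileSync(demo, Buffer.concat([Buffer.from('GGUF'), Buffer.alloc(8)]));
console.log(looksLikeGguf(demo)); // true
```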
To learn more about React Native and on-device AI, take a look at the following resources:
- React Native Website - learn more about React Native.
- Getting Started - an overview of React Native and how to set up your environment.
- Learn the Basics - a guided tour of the React Native basics.
- Blog - read the latest official React Native Blog posts.
- @facebook/react-native - the open-source GitHub repository for React Native.