Native LLMs for mobile apps

Ship multimodal AI experiences with streaming results.

Amaryllis is a modern React Native AI module that runs on-device inference, accepts text and image inputs, and streams tokens through hooks and observables.

Install
npm install react-native-amaryllis
# or
yarn add react-native-amaryllis
# or
pnpm add react-native-amaryllis

What you get out of the box

Native performance with a developer-friendly API.

Native LLM engine

Optimized pipelines for Android and iOS with predictable startup.

Multimodal support

Send prompts with images for grounded, visual responses.

Streaming hooks

Use hooks and observables to render partial tokens fast.

Context provider

Centralize configuration for the entire React Native app.

LoRA customization

Fine-tune behavior with LoRA adapters on devices with GPU support.

Developer control

Cancel and manage sessions with explicit APIs.

Best practices for production apps

Keep performance, memory, and safety predictable.

Stream by default

Show partial tokens early for a responsive UX.

Cancel on unmount

Always cancel async inference in cleanup handlers.
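The document doesn't show Amaryllis's session APIs, so here is a minimal, framework-agnostic sketch of the cancel-on-unmount pattern built on AbortController (available in Node 16+ and modern React Native runtimes); `generateChunks` is a hypothetical stand-in for the library's streaming call, not the real API.

```typescript
// Stand-in for the library's streaming call (illustrative only).
function* generateChunks(prompt: string): Generator<string> {
  for (const word of prompt.split(' ')) yield word;
}

// Deliver chunks until the signal is aborted; returns how many were delivered.
function streamUntilAborted(
  prompt: string,
  signal: AbortSignal,
  onChunk: (chunk: string) => void,
): number {
  let delivered = 0;
  for (const chunk of generateChunks(prompt)) {
    if (signal.aborted) break; // stop as soon as the component is gone
    onChunk(chunk);
    delivered += 1;
  }
  return delivered;
}

// In a component, tie the controller to the lifecycle:
//   useEffect(() => {
//     const controller = new AbortController();
//     streamUntilAborted(prompt, controller.signal, appendChunk);
//     return () => controller.abort(); // cleanup cancels the stream
//   }, [prompt]);
```

Returning the abort call from the `useEffect` cleanup guarantees no chunk callback fires into an unmounted component.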

Constrain inputs

Limit image count and size to protect memory.
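One way to enforce this is a small guard run before calling `generate`. The shape below is a sketch: the `ImageInput` type, the 4 MB budget, and the helper name are illustrative, though the count limit mirrors the `maxNumImages` provider setting shown in the Quickstart.

```typescript
interface ImageInput {
  uri: string;
  byteSize: number;
}

const MAX_IMAGES = 2;                     // keep in sync with maxNumImages
const MAX_IMAGE_BYTES = 4 * 1024 * 1024;  // example per-image memory budget

// Throws before any native memory is allocated for oversized inputs.
function constrainImages(images: ImageInput[]): ImageInput[] {
  if (images.length > MAX_IMAGES) {
    throw new Error(`Too many images: ${images.length} > ${MAX_IMAGES}`);
  }
  const oversized = images.find((img) => img.byteSize > MAX_IMAGE_BYTES);
  if (oversized) {
    throw new Error(`Image too large: ${oversized.uri}`);
  }
  return images;
}
```

Failing fast in JavaScript keeps out-of-memory conditions from surfacing as opaque native crashes.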

Validate file paths

Prevent invalid or unsafe native file access.
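A sketch of such a check, assuming you want to reject traversal sequences and unexpected file types before a path reaches native code; the allow-list below is illustrative, not a list the library prescribes.

```typescript
// Extensions this app expects to hand to the native layer (illustrative).
const ALLOWED_EXTENSIONS = ['.task', '.tflite', '.jpg', '.jpeg', '.png'];

function isSafeFilePath(path: string): boolean {
  if (path.length === 0) return false;
  if (path.includes('..')) return false;  // no directory traversal
  if (path.includes('\0')) return false;  // no embedded null bytes
  const dot = path.lastIndexOf('.');
  if (dot === -1) return false;           // require an explicit extension
  return ALLOWED_EXTENSIONS.includes(path.slice(dot).toLowerCase());
}
```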

Handle errors explicitly

Surface custom error types and provide graceful fallbacks.
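The document doesn't show Amaryllis's error types, so this sketch defines its own `InferenceError` to illustrate the pattern: distinguish a user-initiated cancellation from a real failure, and degrade to a fallback string rather than crashing.

```typescript
// Hypothetical error type; the library's real error shape may differ.
class InferenceError extends Error {
  constructor(
    message: string,
    readonly code: 'MODEL_LOAD' | 'GENERATION' | 'CANCELLED',
  ) {
    super(message);
    this.name = 'InferenceError';
  }
}

async function generateWithFallback(
  generate: () => Promise<string>,
  fallback: string,
): Promise<string> {
  try {
    return await generate();
  } catch (err) {
    if (err instanceof InferenceError && err.code === 'CANCELLED') {
      return '';        // user-initiated; nothing to show
    }
    return fallback;    // degrade gracefully on real failures
  }
}
```

Branching on an error code rather than a message string keeps the fallback logic stable across library versions.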

Document model sources

Track model versions and update strategies.

Quickstart

Wrap your app, then generate from hooks.

Provider setup
import { LLMProvider } from 'react-native-amaryllis';

<LLMProvider
  config={{
    modelPath: 'gemma3-1b-it-int4.task',
    visionEncoderPath: 'mobilenet_v3_small.tflite',
    visionAdapterPath: 'mobilenet_v3_small.tflite',
    maxTopK: 32,
    maxNumImages: 2,
    maxTokens: 512,
  }}
>
  {/* App components */}
</LLMProvider>
Streaming inference
import { useInferenceAsync } from 'react-native-amaryllis';

const generate = useInferenceAsync({
  onResult: (chunk, isFinal) => {
    // Append the partial chunk to UI state; isFinal marks the last one
  },
  onError: (err) => setError(err),
});

// Call from an event handler or effect, not at the top level of a component:
await generate({ prompt, images });

Live demo preview

Sample UI built on top of Amaryllis streaming hooks.