This example app demonstrates both Apple Intelligence and MLC on-device AI capabilities.
> [!IMPORTANT]
> Before running this app, you need to build the MLC runtime binaries.
Navigate to the MLC package and run the build command for your target platform:

For iOS:

```sh
cd ../../packages/mlc
bun run build:runtime:ios
```

For Android:

```sh
cd ../../packages/mlc
bun run build:runtime:android
```

> [!NOTE]
> The build process requires additional setup. Run `./scripts/build-runtime.sh --help` in the MLC package directory to see detailed prerequisites for your platform.
After building the MLC runtime, navigate back to this directory and run:

iOS:

```sh
bun run ios
```

Android:

```sh
bun run android
```

- Apple Intelligence (iOS 17+): Text generation, embeddings, transcription, speech synthesis
- MLC Models: Run Llama, Phi, Mistral, and Qwen models on-device
- Tool calling and structured output support
- Streaming text generation
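As a rough sketch of how streaming text generation is typically consumed in an app like this, the snippet below iterates over an async stream of text chunks and accumulates them. Note that `generateStream` here is a hypothetical stub standing in for the real on-device model call; the actual exports and signatures of the MLC package may differ.

```typescript
// Hypothetical stub standing in for an on-device model's streaming API.
// A real implementation would yield tokens as the model produces them.
async function* generateStream(_prompt: string): AsyncGenerator<string> {
  for (const chunk of ["On-device ", "inference ", "keeps data local."]) {
    yield chunk;
  }
}

// Consume the stream, appending each chunk as it arrives.
async function run(prompt: string): Promise<string> {
  let text = "";
  for await (const chunk of generateStream(prompt)) {
    text += chunk; // e.g. update component state here to render partial output
  }
  return text;
}

run("Why run models on-device?").then(console.log);
```

In a React Native component you would typically set state inside the `for await` loop so the UI renders partial output as it streams in.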
> [!WARNING]
> If you encounter runtime errors related to MLC:
>
> - Ensure you've built the runtime binaries (see above)
> - Run `npx expo prebuild --clean` if you've made configuration changes
> - Check that your device has sufficient memory for the model you're using (1-8 GB)