🎥 Demo Video (App in Action)
The following video shows the app's overall operation: server setup, sending and receiving chats, and using or deleting chat history.
✅ STEP 1: Configure Local LLM Server (LM Studio)
🎯 Objective
Start the local LLM server (LM Studio) that the app will connect to, and make it reachable on your LAN.
🔧 Steps (On your PC)
- Launch LM Studio
- Click on the [Developer] tab
- If [Status: Stopped], toggle [Start server] to ON
- Open [Settings] and turn ON [Serve on Local Network]
- Take note of the IP address displayed in [Reachable at:]
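Before moving to the app, you can optionally confirm the server is reachable from another machine on the LAN. The sketch below is illustrative, not part of the app: it assumes LM Studio's default port [1234] and its OpenAI-compatible `/v1/models` endpoint, and the host address is a placeholder you replace with the value from [Reachable at:].

```python
import json
import urllib.request

def models_url(host: str, port: int) -> str:
    """Build the OpenAI-compatible model-list endpoint that LM Studio exposes."""
    return f"http://{host}:{port}/v1/models"

def list_models(host: str, port: int, timeout: float = 3.0):
    """Query the server and return the ids of loaded models.

    Raises URLError if the server is not reachable on the LAN.
    """
    with urllib.request.urlopen(models_url(host, port), timeout=timeout) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]

# Usage (placeholder address -- substitute the one from [Reachable at:]):
#   list_models("192.168.1.10", 1234)
```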
✅ STEP 2: Configure the App (iOS side)
🎯 Objective
Configure the app to connect to the local LLM server.
📱 Steps (On the App)
- Tap [No Models >] at the top center of the screen
- Select [LM Studio] or [Ollama] for the [API Provider]
- Enter the IP address from STEP 1 into the [Host] field
- If necessary, enter a custom port number in the [Port] field (defaults: LM Studio [1234], Ollama [11434])
- Tap [Test Connection] to verify the connection
- Once the connection is successful, select a [Language Model]
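What [Test Connection] checks can be approximated from a PC with the sketch below. It is an assumption about the app's internals, not its actual code: the port values are the standard server defaults, the request targets LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint, and the host and model name are placeholders.

```python
import json
import urllib.request

def default_port(provider: str) -> int:
    """Standard server defaults: LM Studio listens on 1234, Ollama on 11434."""
    return {"LM Studio": 1234, "Ollama": 11434}[provider]

def chat_once(host: str, port: int, model: str, prompt: str,
              timeout: float = 30.0) -> str:
    """Send one chat-completion request to an OpenAI-compatible endpoint
    (the API style LM Studio serves) and return the reply text."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    req = urllib.request.Request(
        f"http://{host}:{port}/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Usage (placeholders -- substitute your server address and a loaded model id):
#   chat_once("192.168.1.10", default_port("LM Studio"), "some-model", "Hello")
```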
💬 Frequently Asked Questions (FAQ)
Q. Is an internet connection required?
A. No, this app operates within your local network and does not require an internet connection.
Q. Where can I review the tutorial again?
A. You can re-run it anytime from the [View Tutorial] button on the settings screen.
📝 Notes
- This app is intended for users who can set up their own local LLM server (Ollama / LM Studio).
- Please refer to the video above for a detailed walkthrough of each step.