How to Use the Local LLM Client

App Version: v1.1.1

Last Update: 2025-08-17

This page explains the setup procedures required to use the Local LLM Client app.
It provides a step-by-step guide on starting the LM Studio server and configuring the app-side connection.


🎥 Demo Video (App in Action)

The following video shows the app's overall operation: configuring the server, sending and receiving chats, and using or deleting chat history.

✅ STEP 1: Configure Local LLM Server (LM Studio)

🎯 Objective

Start the LM Studio local server and make it reachable on your LAN so that the app can connect to it.

🔧 Steps (On your PC)

  1. Launch LM Studio
  2. Click on the [Developer] tab
  3. If the server shows [Status: Stopped], toggle [Start server] to ON
  4. Open [Settings] and turn ON [Serve on Local Network]
  5. Take note of the IP address displayed in [Reachable at:]
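Before moving on to the app, you can optionally confirm from another machine on the same network that the server is up. LM Studio exposes an OpenAI-compatible API, so its model-list endpoint can be queried directly. Below is a minimal sketch in Python; the IP address is a placeholder for the one noted in step 5, and the default port [1234] is assumed:

```python
import json
import urllib.error
import urllib.request


def models_url(host: str, port: int = 1234) -> str:
    """Build the URL of LM Studio's OpenAI-compatible model-list endpoint."""
    return f"http://{host}:{port}/v1/models"


def check_server(host: str, port: int = 1234, timeout: float = 3.0) -> bool:
    """Return True if the server answers on /v1/models, False otherwise."""
    try:
        with urllib.request.urlopen(models_url(host, port), timeout=timeout) as resp:
            data = json.load(resp)
            # An OpenAI-compatible server returns {"data": [...models...]}
            return "data" in data
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    host = "192.168.1.10"  # placeholder: use your own [Reachable at:] address
    print("reachable" if check_server(host) else "not reachable")
```

If this prints "not reachable", re-check step 4 ([Serve on Local Network]) and any firewall settings on the PC before configuring the app.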

✅ STEP 2: Configure the App (iOS side)

🎯 Objective

Configure the app to connect to the local LLM server.

📱 Steps (On the App)

  1. Tap [No Models >] at the top center of the screen
  2. Select [LM Studio] or [Ollama] for the [API Provider]
  3. Enter the IP address from STEP 1 into the [Host] field
  4. If necessary, enter a custom port number in the [Port] field (Default for LM Studio: [1234])
  5. Tap [Test Connection] to verify the connection
  6. Once the connection is successful, select a [Language Model]
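For reference, the same [Host] and [Port] settings can also be exercised manually with an OpenAI-compatible chat completion request, which is useful when troubleshooting a failed [Test Connection]. This is a minimal sketch, assuming LM Studio's default /v1/chat/completions endpoint; the host address and model name are placeholders:

```python
import json
import urllib.error
import urllib.request


def build_chat_request(host: str, port: int, model: str, prompt: str) -> urllib.request.Request:
    """Assemble an OpenAI-compatible chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"http://{host}:{port}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    # Placeholders: substitute your server's address and a model loaded in LM Studio.
    req = build_chat_request("192.168.1.10", 1234, "your-model-name", "Hello!")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            reply = json.load(resp)
            print(reply["choices"][0]["message"]["content"])
    except (urllib.error.URLError, OSError) as err:
        print(f"Request failed: {err}")
```

If this request succeeds but the app still cannot connect, the issue is likely on the iOS side (e.g., a typo in the [Host] field) rather than with the server.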

💬 Frequently Asked Questions (FAQ)

Q. Is an internet connection required?

A. No, this app operates within your local network and does not require an internet connection.


Q. Where can I review the tutorial again?

A. You can re-run it anytime from the [View Tutorial] button on the settings screen.

📝 Notes

  • This app is intended for users who can set up their own local LLM server (Ollama / LM Studio).
  • Please refer to the video above for detailed operational checks.