Working with LM Studio
Learn how to connect your local LM Studio to Octarine to use Writing Assistant and Ask Octarine.
This guide provides step-by-step instructions to install LM Studio, load a model, and connect it to Octarine. LM Studio lets you run large language models (LLMs) locally on your computer, with no cloud connection required.
Prerequisites
- Operating System: Compatible with macOS or Windows (Linux support is experimental).
- Hardware Requirements: A capable CPU, ample RAM, and free storage are recommended, especially for larger models.
Step 1: Install LM Studio
- Download LM Studio:
  - Go to the LM Studio website and download the installer for your operating system.
- Install LM Studio:
  - macOS: Open the downloaded `.dmg` file and follow the prompts to install.
  - Windows: Run the downloaded `.exe` and complete the installation wizard.
  - Linux: Check the official website for the latest experimental instructions.
Step 2: Download a Model
After installation, launch LM Studio.
- Open LM Studio:
  - Start the application from your applications folder or start menu.
- Find & Download a Model:
  - Browse the built-in model library or use the search bar to find a model (e.g., `Llama 2`, `Mistral`, `Phi-3`).
  - Click Download next to your chosen model and wait for the download to complete.
Step 3: Start the Local Server
LM Studio can act as a local server so Octarine can connect to it.
- Launch the API Server:
  - In LM Studio, go to the API tab (usually on the sidebar).
  - Click Start Server (typically defaults to `http://localhost:1234`).
  - Take note of the server URL displayed.
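To confirm the server is reachable before moving on, you can query LM Studio's OpenAI-compatible `/v1/models` endpoint. A minimal sketch in Python, assuming the default `http://localhost:1234` address from the step above (no third-party packages required):

```python
import json
import urllib.error
import urllib.request

def list_models(base_url="http://localhost:1234"):
    """Query the LM Studio server's /v1/models endpoint.

    Returns a list of loaded model IDs, or None if the server
    is not reachable (e.g. LM Studio is closed or the server
    has been stopped).
    """
    try:
        with urllib.request.urlopen(f"{base_url}/v1/models", timeout=5) as resp:
            data = json.load(resp)
        return [m["id"] for m in data.get("data", [])]
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    models = list_models()
    if models is None:
        print("LM Studio server not reachable - is it running?")
    else:
        print("Available models:", models)
```

If this prints a list of model IDs, the server URL is ready to paste into Octarine in the next step.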
Step 4: Setting it up in Octarine
You’re now ready to connect LM Studio with Octarine.
- Go to `Settings → AI Assistant → AI Providers`
- Click on LM Studio
- Enter the local server URL (from Step 3) and press Save.
Step 5: Using Models in Writing Assistant/Ask Octarine
- Open Writing Assistant or Ask Octarine
- Access the model selector and look for `LM Studio` models
- Select your preferred model and start generating!
Additional Considerations
- Security: By default, the LM Studio API only accepts requests from `localhost`, making it safe for local-only use.
- Model Management: Use the LM Studio interface to delete or update installed models.
- Stopping the Server: In LM Studio, return to the API tab and click Stop Server; or simply close the LM Studio application.