
Working with LM Studio

Learn how to connect your local LM Studio instance to Octarine to use Writing Assistant and Ask Octarine.

This guide provides step-by-step instructions to install LM Studio, load a model, and connect it to Octarine. LM Studio lets you run large language models (LLMs) locally on your own machine, with no cloud required.

Prerequisites

  • Operating System: macOS or Windows (Linux support is experimental).
  • Hardware Requirements: Enough RAM and storage for your chosen model; larger models need more of both and benefit from a fast CPU or GPU.

Step 1: Install LM Studio

  1. Download LM Studio:

    • Visit the official LM Studio website (lmstudio.ai) and download the installer for your operating system.

  2. Install LM Studio:

    • macOS: Open the downloaded .dmg file and follow the prompts to install.
    • Windows: Run the downloaded .exe and complete the installation wizard.
    • Linux: Check the official website for the latest experimental instructions.

Step 2: Download a Model

After installation, launch LM Studio.

  1. Open LM Studio:

    • Start the application from your applications folder or start menu.
  2. Find & Download a Model:

    • Browse the built-in model library or use the search bar to find a model (e.g., Llama 2, Mistral, Phi-3, etc.).
    • Click Download next to your chosen model. Wait for the download to complete.

Step 3: Start the Local Server

LM Studio can act as a local server so Octarine can connect to it.

  1. Launch the API Server:
    • In LM Studio, go to the API tab (usually on the sidebar).
    • Click Start Server (the default address is http://localhost:1234).
    • Take note of the server URL displayed.
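Before moving on, you can confirm the server responds. LM Studio exposes an OpenAI-compatible API, so a minimal sketch (assuming the default http://localhost:1234 address) can list the models the server currently serves:

```python
import json
import urllib.request


def models_url(base_url: str) -> str:
    """Build the OpenAI-compatible model-listing endpoint from the server URL."""
    return base_url.rstrip("/") + "/v1/models"


def list_models(base_url: str) -> list:
    """Return the IDs of models the local LM Studio server exposes.

    Calling this requires the server from Step 3 to be running.
    """
    with urllib.request.urlopen(models_url(base_url)) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]
```

If `list_models("http://localhost:1234")` returns at least one model ID, the server is up and Octarine should be able to connect to the same URL.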

Step 4: Setting it up in Octarine

You’re now ready to connect LM Studio with Octarine.

  1. Go to Settings → AI Assistant → AI Providers
  2. Click on LM Studio
  3. Enter the local server URL (from Step 3) and press Save.

Step 5: Using Models in Writing Assistant/Ask Octarine

  1. Open Writing Assistant or Ask Octarine
  2. Access the model selector and look for LM Studio models
  3. Select your preferred model and start generating!
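Under the hood, requests to LM Studio go to its OpenAI-compatible chat endpoint. As a rough sketch of what such a request looks like (the model name below is a placeholder; substitute whichever model you downloaded in Step 2):

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Assemble an OpenAI-compatible chat-completion request for the local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(base_url: str, model: str, prompt: str) -> str:
    """Send the request and return the reply (requires the server to be running)."""
    with urllib.request.urlopen(build_chat_request(base_url, model, prompt)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

For example, `ask("http://localhost:1234", "mistral", "Summarize this note")` would return the model's reply as plain text, which is essentially what Writing Assistant and Ask Octarine do on your behalf.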

Additional Considerations

  • Security: By default, the LM Studio API only accepts requests from localhost, making it safe for local-only development.
  • Model Management: Use the LM Studio interface to delete or update installed models.
  • Stopping the Server: In LM Studio, return to the API tab and click Stop Server; or simply close the LM Studio application.