Improve behaviour when API host is given as a full URL · Issue #157 · lmstudio-ai/lmstudio-python · GitHub
Improve behaviour when API host is given as a full URL #157

@nullstreak

Description


I have LM Studio 0.3.24 (GUI) running on another Windows 11 computer on my local network. It works just fine when I send OpenAI requests, but it fails when I use this library.

Here are some test results:

Sending OpenAI request to `http://192.168.3.220:1234/v1`...
SUCCESS
----------------------------------------
Sending LM Studio request to `http://192.168.3.220:1234`...
LM Studio is not reachable at http://192.168.3.220:1234. Is LM Studio running?
FAILED
----------------------------------------
Sending LM Studio request to `http://192.168.3.220:1234/llm`...
LM Studio is not reachable at http://192.168.3.220:1234/llm. Is LM Studio running?
FAILED
----------------------------------------
Sending LM Studio request to `ws://192.168.3.220:1234`...
LM Studio is not reachable at ws://192.168.3.220:1234. Is LM Studio running?
FAILED
----------------------------------------
Sending LM Studio request to `ws://192.168.3.220:1234/llm`...
LM Studio is not reachable at ws://192.168.3.220:1234/llm. Is LM Studio running?
FAILED
----------------------------------------
Current Python version: 3.13.7

----------------------------------------
Pip show output:

Name: lmstudio
Version: 1.5.0
Summary: LM Studio Python SDK
Home-page: https://github.com/lmstudio-ai/lmstudio-sdk-python
Author: 
Author-email: LM Studio <team@lmstudio.ai>
License-Expression: MIT
Location: /home/vlad/Documents/Apps/lmstudio_test/venv/lib/python3.13/site-packages
Requires: anyio, httpx, httpx-ws, msgspec, typing-extensions
Required-by: 
----------------------------------------
Code that produces the output above:
import lmstudio as lms
from openai import OpenAI
import subprocess
import sys
import platform

OPENAI_BASE_URL = "http://192.168.3.220:1234/v1"
HTTP_URL = "http://192.168.3.220:1234"
HTTP_URL_WITH_LLM_PATH = "http://192.168.3.220:1234/llm"
WS_URL = "ws://192.168.3.220:1234"
WS_URL_WITH_LLM_PATH = "ws://192.168.3.220:1234/llm"

def main():
    try:
        print(f"Sending OpenAI request to `{OPENAI_BASE_URL}`...")
        OpenAI(base_url=OPENAI_BASE_URL, api_key="dummy").models.list()
        print("SUCCESS")
    except Exception as e:
        print(e)
        print("FAILED")
    print('-' * 40)

    try:
        print(f"Sending LM Studio request to `{HTTP_URL}`...")
        lms.configure_default_client(HTTP_URL)
        lms.list_downloaded_models()
        print("SUCCESS")
    except Exception as e:
        print(e)
        print("FAILED")
    finally:
        lms.sync_api._reset_default_client()
    print('-' * 40)

    try:
        print(f"Sending LM Studio request to `{HTTP_URL_WITH_LLM_PATH}`...")
        lms.configure_default_client(HTTP_URL_WITH_LLM_PATH)
        lms.list_downloaded_models()
        print("SUCCESS")
    except Exception as e:
        print(e)
        print("FAILED")
    finally:
        lms.sync_api._reset_default_client()
    print('-' * 40)

    try:
        print(f"Sending LM Studio request to `{WS_URL}`...")
        lms.configure_default_client(WS_URL)
        lms.list_downloaded_models()
        print("SUCCESS")
    except Exception as e:
        print(e)
        print("FAILED")
    finally:
        lms.sync_api._reset_default_client()
    print('-' * 40)

    try:
        print(f"Sending LM Studio request to `{WS_URL_WITH_LLM_PATH}`...")
        lms.configure_default_client(WS_URL_WITH_LLM_PATH)
        lms.list_downloaded_models()
        print("SUCCESS")
    except Exception as e:
        print(e)
        print("FAILED")
    finally:
        lms.sync_api._reset_default_client()
    print('-' * 40)

    print(f"Current Python version: {platform.python_version()}\n")
    print('-' * 40)

    print("Pip show output:\n")
    subprocess.run([sys.executable, "-m", "pip", "show", "lmstudio"])
    print('-' * 40)

if __name__ == "__main__":
    main()
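Until the SDK accepts full URLs, one way to work around the failures above is to reduce the URL to the bare `host:port` form before passing it to `configure_default_client`. A minimal sketch; the `normalize_api_host` helper is hypothetical, not part of the SDK:

```python
from urllib.parse import urlparse

def normalize_api_host(api_host: str) -> str:
    """Reduce a full URL (http://, https://, ws://, wss://) to the
    bare host:port form; pass bare host:port strings through unchanged."""
    if "://" in api_host:
        # urlparse only splits out the netloc when a scheme is present
        return urlparse(api_host).netloc
    return api_host

# The URLs from the tests above all normalize to the same host:port:
print(normalize_api_host("http://192.168.3.220:1234"))    # 192.168.3.220:1234
print(normalize_api_host("ws://192.168.3.220:1234/llm"))  # 192.168.3.220:1234
print(normalize_api_host("192.168.3.220:1234"))           # 192.168.3.220:1234
```

Something along these lines inside the SDK would also cover the enhancement requested here: accept either form and normalize before connecting.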

Labels: enhancement (New feature or request)
