LM Studio

Bad API Key

Error Message Looks Like This

(This should never happen. LM Studio does not use an API Key.)

Reason

Corrupted file.

Solution

Delete and recreate the model.
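
If you are calling the server directly, note that OpenAI-compatible client libraries still require a non-empty api_key string even though LM Studio ignores it. A minimal sketch, assuming the OpenAI Python client and LM Studio's default local server address:

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's default local server
    api_key="lm-studio",                  # placeholder; the server ignores it
)
```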

Bad Model Name

Error Message Looks Like This

(This should never happen. LM Studio uses the model selected through its own interface.)

Reason

Corrupted file.

Solution

Delete and recreate the model.
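
To confirm which model identifiers the server is actually exposing, you can query the OpenAI-compatible /v1/models endpoint. A minimal sketch, assuming the OpenAI Python client (the api_key is a placeholder, as above):

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# Print every model identifier the server currently exposes.
for model in client.models.list():
    print(model.id)
```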

Bad Parameter

Error Message Looks Like This

Exception: Error code: 400 - {'error': {'message': "This model's maximum context length is 16385 tokens. However, you requested 1000000000033 tokens (34 in the messages, 999999999999 in the completion). Please reduce the length of the messages or completion.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}

Reason

A request parameter has an invalid value. In the example above, the requested completion length (max_tokens) pushes the total token count far past the model's maximum context length.

Solution

Examine the parameter list. Not all providers use the same parameters, and not all models have the same parameters available.
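
As an illustration of the example error above, the fix is to keep the prompt plus the requested completion inside the model's context window. A minimal sketch, assuming the OpenAI Python client; MAX_CONTEXT, the token estimate, and the model name are placeholders, not values your provider necessarily uses:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

MAX_CONTEXT = 16385          # from the error message above; model-dependent
PROMPT_TOKENS_ESTIMATE = 34  # rough size of the messages being sent

response = client.chat.completions.create(
    model="local-model",  # placeholder; use a name your server reports
    messages=[{"role": "user", "content": "Hello"}],
    # Keep prompt + completion inside the context window.
    max_tokens=min(1000, MAX_CONTEXT - PROMPT_TOKENS_ESTIMATE),
)
print(response.choices[0].message.content)
```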

Bad ProviderUrl

Error Message Looks Like This

Exception: Error code: 404

Reason

The ProviderUrl does not point to a running LM Studio server. For LM Studio it should be something like http://localhost:1234/v1 or http://127.0.0.1:1234/v1

Solution

Check the server configuration in LM Studio: make sure the local server is running, and that the ProviderUrl matches its address and port.
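
A quick way to verify the URL before touching the model configuration is to probe the /models endpoint yourself. A minimal sketch using only the Python standard library:

```python
import urllib.request

provider_url = "http://localhost:1234/v1"  # LM Studio's default

try:
    # /models is part of the OpenAI-compatible API, so a 200 here means
    # an inference server is actually listening at this URL.
    with urllib.request.urlopen(provider_url + "/models", timeout=5) as resp:
        print("Server reachable, HTTP status:", resp.status)
except Exception as exc:
    print("Server not reachable:", exc)
```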

Corrupted API Template

Error Message Looks Like This

Inference Failure. provider: 'LM StudioXXX' not found. Most likely due to corrupted API template.

Reason

This should not happen if the model was configured through the UI. Were you messing with the configuration JSON file directly?

Solution

Delete and recreate the model, or revert to the backup of the JSON configuration that you made before messing with it.
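
If you do edit the file by hand, it is worth confirming that it still parses before restarting. A minimal sketch; the file path is hypothetical, so substitute the location of your actual configuration file:

```python
import json

CONFIG_PATH = "config.json"  # hypothetical path; use your real config file

try:
    with open(CONFIG_PATH, encoding="utf-8") as f:
        json.load(f)
    print("Configuration parses as valid JSON.")
except json.JSONDecodeError as exc:
    print("Corrupted JSON:", exc)
```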