Models API
To discover the available models on a Pico AI Homelab instance, use the Models API. Pico comes with two different endpoints for listing models: one is compatible with Ollama, the other with OpenAI.
Pico AI Homelab will only list models that have been downloaded. Unlike Ollama, Pico AI Homelab does not permit client apps to download models directly from the internet. Instead, the Pico administrator can download models through the Pico Settings. This is because many chat clients have non-MLX models hardcoded, which are incompatible with Pico AI Homelab.
v1/models: OpenAI-compatible list of available models
api/tags: Ollama-compatible list of available models
api/ps: Endpoint added for Ollama compatibility. Returns an empty object
api/show: Endpoint added for Ollama compatibility. Returns basic model information
OpenAI-compatible endpoint
GET
/v1/models
OpenAI-compatible list of available models
Headers
Content-Type
application/json
Body
This endpoint expects an empty body
Response
data (Array of objects): Array of model objects
object (String): Always list
Each model object in data contains:
id (String): Name of the model
created (Date, Unix timestamp): Last modified date from Hugging Face (on 1.1.3 and newer)
object (String): Always model
owned_by (String): Always Pico AI Homelab
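As a quick illustration, the sketch below queries /v1/models and prints the id of each downloaded model. The host and port are assumptions (Ollama's default, localhost:11434); point BASE_URL at your own Pico AI Homelab instance.

```python
import json
import urllib.request

# Assumed base URL: adjust host/port to wherever your Pico AI Homelab instance listens.
BASE_URL = "http://localhost:11434"

with urllib.request.urlopen(f"{BASE_URL}/v1/models") as resp:
    payload = json.load(resp)

# payload["object"] is always "list"; each entry in payload["data"] carries
# id, created, object ("model") and owned_by ("Pico AI Homelab").
for model in payload.get("data", []):
    print(model["id"], model.get("created"))
```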
Ollama-compatible endpoints
GET
/api/tags
Ollama-compatible list of available models
Body
This endpoint expects an empty body
Headers
Content-Type
application/json
Response
models (Array of objects): Array of model objects
Each model object in models contains:
name (String): The name of the model
size (Number): For now always set to 1000
modified_at (Date, ISO 8601): Last modified date from Hugging Face (on 1.1.3 and newer)
digest (String): The SHA digest from Hugging Face
model (String): The model's repository name on Hugging Face
details (Object): Contains the following fields:
    format (String): Always returns mlx
    parent_model (String): For now always returns an empty string
    family (String): Model family name, e.g. DeepSeek R1
    quantization_level (String): Quantization, e.g. fp16, bf16, 8bit, 6bit, 4bit, 3bit
    parameter_size (String): Parameter size of the model
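The same listing through the Ollama-style endpoint, again assuming the instance is reachable at localhost:11434:

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434"  # assumed; adjust to your Pico AI Homelab instance

with urllib.request.urlopen(f"{BASE_URL}/api/tags") as resp:
    tags = json.load(resp)

# Each entry in tags["models"] mirrors the fields documented above:
# name, model, size, modified_at, digest, and the nested details object.
for entry in tags.get("models", []):
    details = entry.get("details", {})
    print(entry["name"], details.get("quantization_level"), details.get("parameter_size"))
```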
Ollama PS
GET
/api/ps
Stub for compatibility with Ollama
Headers
Content-Type
application/json
Body
This endpoint expects an empty body
Response
This endpoint returns an empty object
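A minimal check of the stub, assuming the same localhost:11434 base URL as above:

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434"  # assumed; adjust to your Pico AI Homelab instance

with urllib.request.urlopen(f"{BASE_URL}/api/ps") as resp:
    print(json.load(resp))  # Pico returns an empty object here
```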
Ollama Show
/api/show
Endpoint added for Ollama compatibility. Returns basic model information.
Response
parameters (String): Always empty
template (String): Always Unknown
license (String): Always Unknown
modelfile (String): Always Unknown
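Ollama clients normally POST the model name to /api/show. Assuming Pico accepts the same request shape (the method, body, and model name below are assumptions, not taken from this page), a call could look like this sketch:

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434"  # assumed; adjust to your Pico AI Homelab instance

# The model name is purely illustrative; use one returned by /api/tags.
req = urllib.request.Request(
    f"{BASE_URL}/api/show",
    data=json.dumps({"model": "some-model-name"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    info = json.load(resp)

# On Pico, parameters comes back empty and template/license/modelfile are "Unknown".
print(info)
```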