llama-server <embedding> exited with status code -1 #3056

@Gnomesenpai

Description

Describe the bug
llama-server exited with status code -1

Information about your version
Unable to get the version because the server will not start. Docker image used:

```
REPOSITORY                        TAG                                IMAGE ID       CREATED         SIZE
tabbyml/tabby                     latest                             bc5a49b31c6f   6 days ago      2.64GB
```

Information about your GPU

```
Tue Sep  3 20:27:02 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.54.14              Driver Version: 550.54.14      CUDA Version: 12.4     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  Tesla P4                       On  |   00000000:13:00.0 Off |                  Off |
| N/A   67C    P0             25W /   75W |     795MiB /   8192MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
```

Additional context
I removed my old Tabby setup, pulled the new container, and set up a new data folder; however, it fails with the following error:

```
Starting...2024-09-03T19:25:59.506655Z  WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:98: llama-server <embedding> exited with status code -1, args: `Command { std: "/opt/tabby/bin/llama-server" "-m" "/data/models/TabbyML/Nomic-Embed-Text/ggml/model.gguf" "--cont-batching" "--port" "30888" "-np" "1" "--log-disable" "--ctx-size" "4096" "-ngl" "9999" "--embedding" "--ubatch-size" "4096", kill_on_drop: true }`
```
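Since the supervisor only reports the exit code and not the underlying cause, one common thing worth ruling out is a corrupt or partially downloaded model file (the `-m` path in the command above). A minimal sanity-check sketch, not part of Tabby and purely illustrative, that verifies the file starts with the GGUF magic bytes and a plausible little-endian version field:

```python
import struct

def looks_like_valid_gguf(path: str) -> bool:
    """Rudimentary sanity check for a GGUF model file: the format begins
    with the ASCII magic 'GGUF' followed by a little-endian uint32 version.
    A truncated or corrupt download will typically fail this check."""
    try:
        with open(path, "rb") as f:
            magic = f.read(4)
            if magic != b"GGUF":
                return False
            version_bytes = f.read(4)
            if len(version_bytes) < 4:
                return False
            (version,) = struct.unpack("<I", version_bytes)
            return version >= 1
    except OSError:
        return False

# Path taken from the error message above:
# looks_like_valid_gguf("/data/models/TabbyML/Nomic-Embed-Text/ggml/model.gguf")
```

If the check fails, deleting the model directory so Tabby re-downloads it may be enough; if it passes, the crash more likely stems from the runtime environment (e.g. CUDA/driver mismatch inside the container).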
